• @[email protected]
    link
    fedilink
    English
    223 months ago

    Not everyone will use it. Many right-wing types spread divisive or incorrect information, for example.

    • @[email protected]
      link
      fedilink
      63 months ago

      I’ve also seen boobies that were not flagged NSFW. I didn’t throw my hands up and just delete my NSFW filter though.

    • @[email protected]
      link
      fedilink
      23 months ago

      I was going to suggest community tagging. But then I remembered that’s how Steam works, and their game tags are so stupid. So many “souls-like” games that aren’t really.

      • Pika · 1 point · 3 months ago

        So many “multiplayer” ones that are strictly single-player as well.

        Or games marked Steam Play compatible with a multiplayer tag, where apparently they expected you to just hand control to your friend and watch, since it’s single-player only.

  • @[email protected]
    link
    fedilink
    English
    783 months ago

    Slop is insulting. If I take the time to read it, I want another human to have taken the time to write it.

    • @[email protected]
      link
      fedilink
      English
      123 months ago

      The counter there is to have an AI summarize it. No time taken to write nor to read haha

      • @[email protected]
        link
        fedilink
        English
        43 months ago

        That would work if it didn’t get even that wrong a huge amount of the time. There are entire subreddits dedicated to AI summary fails!

        • @[email protected]
          link
          fedilink
          23 months ago

          Cherry-picked and edited to give bad answers. Go play around with any of the big models; you’ll be bored and disappointed, because 99.9% of the time it will give you exactly what you ask for.

          Except Gemini. Gemini is a drunk.

          • @[email protected]
            link
            fedilink
            English
            13 months ago

            Yes, I’m sure the people posting funny AI summary fails to laugh at with others have an agenda and are all doctoring their screenshots…

      • @[email protected]
        link
        fedilink
        English
        13 months ago

        I foresee a future where we have an AI layer on top of corporate emails, translating from English to corpo-speak and back to English again.

    • @[email protected]
      link
      fedilink
      23 months ago

      Agree to an extent. It’s a tool that can help talentless folk like myself shitpost, and it has its place. But I agree with tagging, and I disagree with inundating forums and stealing IP.

  • @[email protected]
    link
    fedilink
    133 months ago

    Sure. The only problem is, it’s a people issue. Some people making AI-generated content may be honest and willing to abide by such a rule, but most are proud to not even read the rules and just blast shitty slop left and right. For that second category of people, when you point it out to them, a very small percentage go “oh, sorry”. The vast majority just keep posting until blocked.

    Granted, this experience mostly stems from every media-posting site out there, so it may be a bit biased…

  • @[email protected]
    link
    fedilink
    143 months ago

    It should be finable, starting at like 500 dollars plus any profits and ad revenue, if it’s not labelled.

  • @[email protected]
    link
    fedilink
    283 months ago

    I am in complete agreement with this. While you can currently tell what’s AI, it won’t be long before we’re scratching our heads wondering which way is up and which way is down. Hell, I saw an AI-generated video of a cat cooking food. It looked real, sort of.

  • @[email protected]
    link
    fedilink
    English
    63 months ago

    A lot of people seem to think that all AI art is low-effort garbage, which is just not true. There can be a lot of skill put into crafting the correct prompt to get the image you want from an image generator, not to mention the technical know-how of setting it up locally. The “AI art is not art” argument to me doesn’t sound any more substantiated than “electronic musicians aren’t musicians, go learn a real instrument” or “photographers aren’t really artists, all they do is push a button”.

    But regardless, I agree that we need good tagging, or, as @ThatWeirdGuy1001 said, different communities. Even though the output looks similar, actually drawing things and wrangling prompts are two completely different skillsets, and the way we engage with the artistic products of those skills is completely different. You wouldn’t submit a photo you took to a watercolor painting contest. Same with AI art and non-AI art.

    Anyway, just thought I’d share my opinion as an AI non-hater.

    • @[email protected]
      link
      fedilink
      English
      5
      edit-2
      3 months ago

      For art to be art, you need space to express yourself through individual choices:

      • Play an original song on a real instrument, and you have the entire artistic spectrum to yourself.
      • If you make the music for it out of individual pieces, you narrow that range. The sounds are not yours, only their composition and the words.
      • When you record a cover of a rap song over someone else’s beat, you narrow it down further, to your performance only. It’s still artistic expression, but to a much lesser degree than an original song.

      In a prompt-generated image, the image itself is not your expression. The prompt is, but comparing the number of choices you need to make for a painting versus a prompt, it’s just so… less art?

      • @[email protected]
        link
        fedilink
        3
        edit-2
        3 months ago

        In digital art, the image itself is not your expression. The idea is, but comparing the choice of shaders you use with brush strokes done with real paint, where you can see and feel the emotions the artist wanted to express with their physical brush, it’s just so… less art?

  • Uriel238 [all pronouns] · 4 points · 3 months ago

    I think that would be a great Lemmy feature that puts an AI content warning when it is so tagged, similar to how it blurs images tagged NSFW.

  • fmstrat · 12 points · 3 months ago

    Adobe is trying for the opposite: content authenticity, with digital signatures to show something is not AI (I’ve been having conversations with them on this).

    • @[email protected]
      link
      fedilink
      23 months ago

      Very nice idea in theory, but proving there is no AI involved in the creation of art is not something I think is remotely possible. It’s an arms race more than anything, but I’m very interested in how Adobe will tackle it. I think people will be appreciating physical art more again, but even then we could argue about the usage of AI tools.

      Anyhow, people will have to come to terms with the fact that AI is here to stay, and will only get better too.

      • fmstrat · 2 points · 3 months ago

        My other reply talks about how this works with cryptographic signatures, but sure, people can lie. The key to this method is that if there is a signature from a reputable artist, news org, or photographer, then that origin can’t be forged. So it’s about proving authenticity (origin) rather than proving the negative (that no AI was used).

        • @[email protected]
          link
          fedilink
          13 months ago

          Pretty cool indeed, thank you. I like the idea of a cryptographic certificate of authenticity, would definitely add value to the digital art world.

    • JustEnoughDucks · 1 point · 3 months ago

      And, being Adobe, they will put a nice little backdoor in it that lets them change the credentials, so they can take artists’ work, use it, train their AI with it, and sell it, like they have been doing for years.

      • fmstrat · 1 point · 3 months ago

        You can’t change the credentials if the user owns the private key. But nothing stops AI training, that’s part of the terms of service of some of their products, which operate outside the realm of this more open initiative.

        • JustEnoughDucks · 1 point · 3 months ago

          Spoken like a real Adobe rep lol.

          It’s called a backdoor for a reason. Also, since Adobe software nowadays has almost full access to your machine, what is to stop Adobe from simply uploading and storing your private key on their servers and using it when they like? They run their DRM client, with a ton of rights to your computer, on boot.

          WhatsApp can do exactly the same thing and read every message you write while still claiming it is “end-to-end encrypted”, for example, because key creation happens through a process in their proprietary software.

          • fmstrat · 1 point · 3 months ago

            Not sure why you’d say that; it’s just a factual statement. Also, I don’t even use Adobe products, and transitioned to GIMP and Shotcut many, many years ago. I work in privacy and data security, so I just happen to be involved with this initiative from the sidelines.

            As for your commentary, you could say the same thing about Signal. But you wouldn’t, because you like them. Just because you don’t like a company doesn’t mean they are being nefarious.

            Would I rather a privacy-focused company be doing this? Yes.

            Am I pleased with what I see from Adobe (a weekly working group full of identity and open source community members)? Yes.

            Does Adobe have a good chance of making this mainstream because of their ecosystem? Also yes.

            When you see something better, let me know and I’ll participate there too, instead of complaining about those who are trying.

            • JustEnoughDucks · 1 point · edited · 3 months ago

              https://community.signalusers.org/t/overview-of-third-party-security-audits/13243

              Here is an entire list of years and years of independent audits

              https://github.com/signalapp

              Here, go look yourself to verify that the frontend isn’t sending your encryption key back to the server.

              https://www.adobe.com/trust/security.html

              Please tell me where I can find the source code of Adobe’s creative cloud DRM that has full access to the computer it is installed on and their audits to verify that they aren’t sending my private keys back.

              You are comparing an audited, open-source program with a closed-down proprietary system that says “trust me bro, we work with ‘security partners’; no, we won’t release the audits”.

              Interesting comparison. It’s like comparing a local farming co-op to the agro-industrial complex of Monsanto/Bayer and saying “you could say the same about either! Monsanto is at least innovating in the seed space; no, no, no, ignore how they use it!!”

              • fmstrat · 1 point · 3 months ago

                You’re taking that out of context. Signal is open source, but you don’t get to see what happens between GitHub and the Play Store. Adobe’s system that I am alluding to is also open, but we don’t get to see what happens in the software itself. The problem is, that’s not even what I’m talking about: I’m talking about a standard they are developing, not their software or DRM.

                This isn’t just for Adobe; they’re just starting the process. Other systems can run it. Hardware can run it. Do you not use Linux because Canonical or Red Hat contributed? Do you steer developers away from Flutter because Google started it? Where is the line? Who do you think kicks off all the standards you use today? OAuth, OIDC, etc. If you want to avoid everything these companies contribute to, you’re going to have to stop using the internet.

    • @[email protected]
      link
      fedilink
      193 months ago

      Oh I’m sure Adobe has the greatest of intentions on this. Such a reputable company that has a stellar past.

      I’m sure they won’t gatekeep this digital human signature in some atrocious proprietary standard along with an expensive subscription to have the honor of using it.

      Don’t listen to Adobe on AI, or even better, don’t accept any “idea” or solution from Adobe.

      • @[email protected]
        link
        fedilink
        2
        edit-2
        3 months ago

        Yeah pretty much.

        I recall Flash, and how they absolutely controlled it. I loved Flash as a young programmer too.

        But in retrospect, forcing users to go through Adobe to use something, with no alternatives? What a nightmare for an open Internet.

    • @[email protected]
      link
      fedilink
      English
      73 months ago

      How would that work, then? I presume most would just ignore it, because if it only verifies you used Adobe to make something, it’s pretty worthless as a “this isn’t AI” mark.

      • fmstrat · 1 point · edited · 3 months ago

        It uses cryptographic signatures in the cameras and tools. Say you take a photo with a compatible camera: it gets a signature. Then you retouch it in Photoshop: it gets another signature. And this continues through however many layers. The signature is in the file’s EXIF data, so it can be read on the web, meaning a photo on a news site could be labeled as authentic, retouched, etc.

        Edit: Doesn’t require Adobe tools. Adobe runs the services, but the method is open. There are cameras on the market today that do this when you take a picture. I believe someone could add it to GIMP if they desired.
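        [Editor’s note] A toy sketch of how such a provenance chain can work, for the curious. This is an illustration of the general idea, not Adobe’s actual scheme: real systems like Content Credentials use public-key certificates and embed a signed manifest in the file, whereas the shared HMAC key below is a simplified stand-in for a signing key. All names here (`sign_step`, `verify_chain`) are made up for the example.

```python
import hashlib
import hmac
import json

def sign_step(content: bytes, actor: str, key: bytes, chain: list) -> list:
    """Append a signed manifest entry covering the current content hash
    and linking back to the previous entry's signature."""
    prev_sig = chain[-1]["sig"] if chain else ""
    payload = json.dumps(
        {"actor": actor,
         "content_hash": hashlib.sha256(content).hexdigest(),
         "prev": prev_sig},
        sort_keys=True)
    sig = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
    return chain + [{"payload": payload, "sig": sig}]

def verify_chain(content: bytes, key: bytes, chain: list) -> bool:
    """Check every signature, the back-links between entries, and that the
    last entry matches the content as it exists now."""
    prev_sig = ""
    for entry in chain:
        expected = hmac.new(key, entry["payload"].encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, entry["sig"]):
            return False  # entry was tampered with or forged
        if json.loads(entry["payload"])["prev"] != prev_sig:
            return False  # chain was reordered or truncated
        prev_sig = entry["sig"]
    last = json.loads(chain[-1]["payload"])
    return last["content_hash"] == hashlib.sha256(content).hexdigest()

# A camera signs the raw photo, then an editing tool signs the retouched
# version; each edit adds a layer, as described above.
key = b"demo-signing-key"
photo = b"raw sensor data"
chain = sign_step(photo, "camera", key, [])
retouched = photo + b" retouched"
chain = sign_step(retouched, "editor", key, chain)

print(verify_chain(retouched, key, chain))       # True
print(verify_chain(b"swapped image", key, chain))  # False
```

        The design point is that each signature covers both the content and the previous signature, so an image swapped in after the fact, or a history with a step removed, fails verification even though each individual signature may still look valid.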

  • @[email protected]
    link
    fedilink
    113 months ago

    Text, sure. But I don’t get the hate towards AI generated images. If it’s a good image and it’s not meant to mislead, I am completely fine with AI content. It’s not slop if it’s good.

    • @[email protected]
      link
      fedilink
      English
      103 months ago

      It’s still stolen content. Regardless of any other issues, it’s 100% stolen content.

        • @[email protected]
          link
          fedilink
          English
          63 months ago

          Yes, just because you disagree that your new toy is literally theft and is one of the most irresponsible inventions since leaded gasoline, that doesn’t change anything.

          Sorry you’re the type of person that added lead shot to your gas tank after they banned leaded gasoline.

          • arglebargle · 5 points · edited · 3 months ago

            Sorry you’re the type of person that added lead shot to your gas tank after they banned leaded gasoline.

            Well, that devolved quickly. People with attitudes like yours make other people really not give a shit what your argument is. It also tells me you can’t, or won’t, understand that I don’t really care what happens to AI, and that since no data is taken, it cannot be stolen. But you can’t understand that, I guess, and we get the same tired arguments.

            At least I am somewhat happy that the corporate control is being taken down by open source, that models are being jailbroken or freed, and that people are realizing that what we have are only LLMs and generative noise algorithms: not AI.

        • @[email protected]
          link
          fedilink
          English
          63 months ago

          There’s a pretty clear difference between the two. If piracy resulted in a new digital good that removes the market for the original good while eliminating the jobs of those who made it, then it’d be close. Even then, pretty much everyone agrees not all piracy is the same; you wouldn’t pirate an indie game that hasn’t sold well unless you’re an absolute piece of subhuman shit.

          • Pika · 2 points · edited · 3 months ago

            Well, uh, idk how to break it to you, but it kinda does.

            Piracy doesn’t equal a 1:1 lost sale; that argument is true. However, it applies to both AI and piracy, and it goes both ways.

            The more people who get something via the free method, the fewer people who /may/ have bought it via the paid method, meaning less profit/earnings for the affected party.

            However, since it goes both ways, obtaining the item via the free method does not mean they would have purchased the paid good if the free one weren’t available.

            In both cases, the original market is still available, regardless of the method used.

            I highly disagree that piracy and AI are any different, at least in the scenario you provided.

            If anything, AI would hold the morally higher ground, imo, as it isn’t directly taking a product; it’s making something else using other products.

            That being said, I believe that CCs should be paid for the training usage, but that’s a whole different argument.

            • @[email protected]
              link
              fedilink
              English
              13 months ago

              It’s not solely about pay, but also what your work is used for. It makes sense you don’t understand this if you’ve never created anything, artwise or otherwise. If I draw a picture I control who displays that picture and for what purpose. If someone I don’t like uses that picture without permission it reflects poorly on me, and destroys my rights.

              The easy example is an art piece by a Holocaust survivor being used by a neonazi without permission.

              Now imagine you steal tens of millions of artists work. You know for a fact you don’t have the licenses needed to ensure their work is used to their liking.

              • Pika · 1 point · 3 months ago

                I don’t make art myself, the closest I come is software development, which is already heavily scraped and used for training AI models. So, I agree that I might not fully understand, especially since my field tends to embrace assistive tools.

                That said, I think the idea that AI-generated art reflects poorly on the original artist is a bit of a misnomer, or self-inflicted. When someone looks at an AI-generated piece, they’re not going to think, “Oh, that was by Liyunxiao,” because the end product isn’t a direct copy of any specific work. The models don’t store or reproduce the original source data; they learn patterns from the source material and then reapply them using what they have learned, often with a lot of randomization (as shown by their sometimes blatant inability to produce realistic-looking outputs).

                While I believe we agree that work should have the artist’s permission before use in a training model, or at the very least that the artist should be paid instead of the work just being scraped, I think the two are comparable. One makes a new piece of art using what it has “learned” from traits of the training set; one copies an existing piece of art. Neither prevents anyone from using the original source (artist or game studio), and both are usually done against the wishes of the original team.

                That being said, I think the example you provided works better when compared to piracy, as at least at that point it’s a 1:1 clone instead of a creative work. An art piece by a Holocaust survivor thrown into a diffusion model’s training set wouldn’t come out the same image on the other end; only a generalization and style set are saved. At the end of the day, nobody has the ability to know where the diffusion art’s original sources came from, nor can it produce a picture recognizably in a specific artist’s style, whereas with piracy you have a piece of work you can look up to see who owned it.

                That’s just my opinion on it all though.

          • @[email protected]
            link
            fedilink
            23 months ago

            I really enjoyed the “Hobbit: Extended Edition” project which condensed the three films of the Hobbit trilogy down into a single film, and as an unofficial fan-made project, is only available online for free.

            Under that proposed gradient, I’m not sure where that would fall, given that it is a transformative work which uses the work of others to make them redundant (in this case, the original trilogy and the studios which would have otherwise profited from those sales).

            I feel like there’s a better way to divide it, but it will be difficult to negotiate the exact line against the long-held contradictory ideas that art should be divorced from its creator once released, yet that the creator is entitled to full control and profit until its copyright expires.

      • @[email protected]
        link
        fedilink
        63 months ago

        I am torn on that. If it’s a company making money off of it, despicable. If it’s an open source model used for memes? I’m fine with that. We shouldn’t act like artists follow some magical calling from god. Anything anyone creates is built on their education and the media they were exposed to. I don’t think generative models are any different.

        • @[email protected]
          link
          fedilink
          English
          23 months ago

          Normalizing is a thing; on top of that, there are still indie markets that can be supplanted by GAN image generation. And artists still have rights to their work: if they didn’t explicitly license their works for the model, it’s theft that removes the value of the original.

    • ekZeppOP · 3 points · edited · 3 months ago

      All opinions are always shit to someone, somewhere 👍 💩

    • @[email protected]
      link
      fedilink
      English
      23 months ago

      The default comment under any post here is getting mad at OPs when the sidebar literally says “everything and anything goes”.