Thanks ahead of time for your feedback

  • I Cast Fist
    link
    fedilink
    11
    11 months ago

    Doctored photos have always been a problem and, legally speaking, could lead to the faker being sued for defamation, depending on what was done with the person’s image.

    AI Photos are only part of the problem. Faking the voice is also possible, as is making “good enough” videos where you just change the head of the actual performer.

    Another part of the problem is that this kind of stuff spreads like wildfire within groups (and it’s ALWAYS groups where the victim is) and any voices stating that it’s fake will be drowned by everyone else.

  • @[email protected]
    link
    fedilink
    35
    11 months ago

    Because previously, if someone had the skills to get rich off making convincing fake nudes, we could arrest and punish them - people with similar skillsets would usually prefer more legitimate work.

    Now some ass in his basement can crank them out and it’s a futile game of whack-a-mole to kill them dead.

    • magnetosphere
      link
      fedilink
      4
      11 months ago

      If it’s a game of whack-your-hog, however, deepfakes aren’t futile at all.

    • @[email protected]
      link
      fedilink
      English
      17
      edit-2
      11 months ago

      it’s a futile game of whack-a-mole

      It’s still going to be futile even with this law in place. Society is going to have to get used to the fact that photo-realistic images aren’t evidence of anything (especially since the technology will keep improving).

      • @[email protected]
        link
        fedilink
        13
        11 months ago

        It blows my mind when I think about where we might be headed with this tech. In the technology age we’ve gotten SO used to the ability to communicate instantly with people far away - how will we adapt when we have to go back 300 years and can only trust something someone tells us in person? Will we go back to local newspapers? Or can we not even trust that? Will we have public amphitheaters in busy parts of town, where people gather around for the news? And we can only trust these people, who have a direct chain of acquaintance all the way back to the source of the information? That seems extreme, but I dunno.

        I think most likely we won’t implement extreme measures like that to ensure we’re still getting genuine information. I think most likely we’ll just slip into completely generated false news from every source, no longer have any idea what’s really going on, but be convinced this AI thing was overblown, and have no idea we’re being controlled.

        • @[email protected]
          link
          fedilink
          English
          8
          11 months ago

          I don’t think it will be quite that bad. Society worked before photography was invented and now we have cryptographic ways to make sure you’re really talking to the person you think you’re talking to.
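
          A minimal sketch of what that cryptographic assurance can look like in practice: signing a message and verifying the signature. This assumes Ed25519 and the third-party Python “cryptography” package purely for illustration - nothing in the thread specifies either.

          ```python
          # Minimal sketch: Ed25519 signing/verification with the third-party
          # "cryptography" package (pip install cryptography). Illustrative only.
          from cryptography.exceptions import InvalidSignature
          from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

          # The signer generates a key pair once; the public key is shared through
          # a channel you already trust (in person, key-signing, a verified profile...).
          private_key = Ed25519PrivateKey.generate()
          public_key = private_key.public_key()

          message = b"This statement really came from me."
          signature = private_key.sign(message)

          # Anyone holding the public key can check that the message came from the
          # holder of the private key and wasn't altered along the way.
          try:
              public_key.verify(signature, message)
              print("Signature checks out - message is authentic.")
          except InvalidSignature:
              print("Signature does not match - don't trust this message.")
          ```

          The hard part isn’t the math, it’s the trust anchor: the check only proves the key matches, not who is actually holding it.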

          • @[email protected]
            link
            fedilink
            3
            11 months ago

            Truuue, I hadn’t thought of that. Okay, at least it won’t be as bad as I feared.

            Now I just have to sell all these vintage printing presses I bought…

  • Snot Flickerman
    link
    fedilink
    English
    22
    edit-2
    11 months ago

    It was a big deal back then, too, but a lot harder to police, and a lot more obvious that they were fakes.

    Gillian Anderson fakes were real fuckin popular during the time The X-Files was on the air.

    EDIT: Searching for women from the time talking about the phenomenon in the 90’s is difficult because it mostly turns up… troves of fake nudes of these women. Of course.

    • Don_DickleOP
      link
      fedilink
      5
      11 months ago

      Did any women or men fight back against having nudes of them on the net, like Swift did?

        • Don_DickleOP
          link
          fedilink
          3
          11 months ago

          Did she ever unleash her wrath like the article says? Maybe it’s the nerd in me, but I never wanted to see her naked - I just want to see her in a Princess Leia-esque outfit. IYKYK

      • Snot Flickerman
        link
        fedilink
        English
        13
        edit-2
        11 months ago

        I recall women heavily disliking it back then, but I also recall that people in general viewed the internet as just full of weirdos and creeps. The internet wasn’t mainstream, by any stretch of the imagination, so I think it likely “got swept under the rug” because of a general feeling of “who cares what weirdos do online? We’re real people and we never use the internet because we have lives.”

        Also, fewer lawyers understood the tech at the time, or how to figure out who was producing these images, and how to prosecute them. So I’d wager that part of going after them was held back by tech-unsavvy lawyers who were like “What’s happening where and how? Dowhatnow? Can you FAX it to me?”

  • BarqsHasBite
    link
    fedilink
    10
    edit-2
    11 months ago

    AI is much better. Photoshop was always a little off with size, angle, lighting, etc. Very easy to spot fakes.

    • Don_DickleOP
      link
      fedilink
      3
      11 months ago

      Not that I watch the morons, but how come it seems that the Kardashians are so fond of it?

        • Don_DickleOP
          link
          fedilink
          3
          11 months ago

          Neither do I, it’s just that I see headlines about Photoshop fails when they publish pictures and it’s really obvious.

  • @[email protected]
    link
    fedilink
    English
    6
    11 months ago

    In addition to the reduced skill barriers mentioned, the other side effect is the reduced time spent finding a matching photo and actually doing the work. Anyone can create it in their spare time, quickly and easily.

  • @[email protected]
    link
    fedilink
    22
    11 months ago

    It’s always been a big deal; it just died down as Photoshop became normalized and people became accustomed to it as a tool.

  • HubertManne
    link
    fedilink
    3
    11 months ago

    I sorta feel this way. Before that, people would make cutout mashups, or artistic types might depict something. I do get that it’s getting so real that folks may think the people actually did the thing.

  • @[email protected]
    link
    fedilink
    English
    64
    edit-2
    11 months ago

    Honestly? It was kind of shitty back then and is just as shitty nowadays.

    I mean, I get why people do it. But in my honest opinion, it’s still a blatant violation of that person’s dignity, at least if it’s distributed.

    • @[email protected]
      link
      fedilink
      English
      23
      11 months ago

      It’s not that now it’s bad… it’s that now it’s actually being addressed. Whereas before it was just something people would sweep under the rug as being distasteful, but not worthy of attention.

      • @[email protected]
        link
        fedilink
        10
        11 months ago

        It was easier to ignore when teenagers couldn’t produce convincing images of their classmates with about 5min of research and a mediocre piece of software, then plaster it all over their friend groups or - god forbid - online forums/social media.

  • @[email protected]
    link
    fedilink
    9
    edit-2
    11 months ago

    I got a few comments pointing this out. But media is hell bent on convincing people to hate AI tools and advancements. Why? I don’t know.

    Tin foil hat take is that it can be an equalizer. Powerful people that own media like to keep powerful tools to themselves and want the regular folk to fear them and regulate ourselves out of using them.

    Like, could you imagine if common folk rode dragons in GOT? Absolutely disgusting. People need to fear them, and only certain people can use them.

    Same idea. If you’re skeptical, go look up all the headlines about AI in the past year and compare them to right wing media’s headlines about immigration. They’re practically identical.

    “Think of the women and children.”

    “They’re TAKING OUR JOBS”

    “Lot of turds showing up on beaches lately”

    “What if they kill us”

    “THEY’RE STEALING OUR RESOURCES”

    • @[email protected]
      link
      fedilink
      2
      11 months ago

      You’re looking for a cat’s fifth leg. There is no conspiracy. It’s just new technology, and what’s new is scary, especially big leaps, which this new age of machine learning seems to be part of.

  • Tarquinn2049
    link
    fedilink
    48
    11 months ago

    It’s a bit of a blend: it has always been a big deal, and it is indeed still more of a big deal now because of how easy, accessible, and believable the AI can be. Like even nowadays, Photoshop hits only one point of that triangle. But it was even less capable back in the day - it could hit half of one of those points at any given time.

    Basically, a nude generated by a good AI has to be proven false, because it doesn’t always immediately seem fake at first. If you have seen obvious AI fakes, they are just that: obvious. There are many non-obvious ones that you might have seen and not known were fake. That is, of course, assuming you have looked.

    The other reason it can be more of a big deal now is that kids have been doing it of other kids. And since the results can be believable, the parents didn’t know they were fake to start with, so it would blow up as if it were real before anyone found out it was AI. And anything involving that is gonna be a big deal.

      • Tarquinn2049
        link
        fedilink
        7
        11 months ago

        I mean, that was an issue in the first month or so. Though I could see how the automated tools people use for this specific purpose might not stay up to date. I haven’t specifically interacted with those. But proper AI tools have in-filling to correct mistakes like that: you can keep the rest of the image and just “reroll” a section of it until whatever you didn’t like about it is fixed. Super quick and easy.

  • shastaxc
    link
    fedilink
    28
    11 months ago

    If AI is so convincing, why would anyone care about nudes being controversial anymore? You can just assume it’s always fake. If everything is fake, why would anyone care?

    • all-knight-party
      link
      fedilink
      9
      11 months ago

      Specifically because it’s convincing. You may just assume everything is fake, that doesn’t mean everyone will. You may not care about seeing someone’s imperceptibly realistic nude, but if it’s depicting them they may care, and they deserve the right for people not to see them like that.

      Just because it’s not logistically feasible to prevent convincing AI nudes from spreading around doesn’t make it ethical

    • @[email protected]
      link
      fedilink
      11
      edit-2
      11 months ago

      You’re right. I’m going to go make convincing images of your partner/sibling/parents/kids/etc. and just share them here since no one should care.

      In fact, I’ll share them on other sites as well.

      • shastaxc
        link
        fedilink
        1
        11 months ago

        That is where we are not seeing this the same way. You wanna make fake images, knock yourself out. I don’t care who they are. Make some of me for all I care.

        • @[email protected]
          link
          fedilink
          1
          edit-2
          11 months ago

          So if this happened to your significant other or children, and they are clearly upset and want it to stop, you would just go “I don’t care?”

          We all sometimes dig in our heels and become absolutist in our arguments online, but dude…do I really need to explain how insane that sounds? Do you really just have a complete lack of imagination and/or empathy?

          What happens when someone makes a convincing fake of you having sex with a kid and spreads it around “for the lulz”?

        • @[email protected]
          link
          fedilink
          4
          edit-2
          11 months ago

          I should’ve clarified that this was meant to be tongue in cheek but also illustrative of the issue. Something being “fake” doesn’t make it any less real to a victim. I understand it was flippant and a bit aggressive, but the intention was to get them to consider the impact it could have on them personally and those they love. It’s a very serious problem and I just struggle to see how people can shrug it off, which admittedly isn’t something I should take out on folks here.

          Hopefully the point still comes across

  • @[email protected]
    link
    fedilink
    28
    11 months ago

    I have a similar opinion. People have been forging/editing photographs and movies for as long as the techniques have existed.

    Now any stupid kid can do it; the hard part with AI is actually not getting porn. If it can teach everyone that fake photos are a thing, and make nudes worthless, maybe that’s a silver lining (what’s the point of a nude anyway? Genitals look like… genitals).

            • @[email protected]
              link
              fedilink
              English
              1
              edit-2
              11 months ago

              People who are thinking of dating, or who are dating, have been known to send nudes to each other.

              This serves several purposes: it indicates interest, it’s meant to entice interest, it’s a sign that things are going well, it sets the mood, sets the tone, etc. It is a form of sexual communication which is necessary, and not unheard of, when people are dating.

              • @[email protected]
                link
                fedilink
                2
                11 months ago

                Oh, I thought you were talking about where fake nudes matter. I didn’t think we were talking about “legitimate” uses of nudes, because this whole thread is about fakes :D

                • @[email protected]
                  link
                  fedilink
                  English
                  3
                  edit-2
                  11 months ago

                  It’s not necessary.

                  This whole thread was in response to “What is the point of a nude anyway”

    • @[email protected]
      link
      fedilink
      8
      edit-2
      11 months ago

      Imagine this happening to your kid(s) or SO. Imagine this happening to you as a hormonal, confused 15 year old.

      Would you tell them “this doesn’t really matter”?

      Kids have killed themselves because they feel ostracized by social media, the act of just not being included or having a “great life” like they think they see everyone else having. You think they’re simply going to adapt to a world where their classmates (and complete strangers) can torture them with convincing nude images?

      This is the bullying equivalent of a nuclear weapon IMO

  • @[email protected]
    link
    fedilink
    English
    15
    edit-2
    11 months ago

    How do you prove it’s not you in either case? Photoshop doesn’t make a whole video of you fucking a sheep. But AI can and is actively being used that way. With Photoshop it was a matter of getting ahold of the file and inspecting it. Even the best Photoshop jobs have some key tells: artifacting, layering, all kinds of shading and lighting, how big the file is, etc. (there’s a rough sketch of that kind of basic inspection at the end of this comment).

    I want to add something. What if all of a sudden it’s your 12-year-old daughter being portrayed in this fake? What if it’s your mom? It would have been a big deal to you to have that image out there of your loved one back in the ’90s or early 2000s. It’s the same kind of big deal now, but more widespread because it’s so easy now. It’s not okay to just use the image of someone in ways they didn’t consent to. I have a similar issue with facial recognition, regardless of the fact that it’s used in public places where I have no control over it.
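
    To make the “getting ahold of the file and inspecting it” point concrete, here’s a rough sketch of the most basic kind of check - reading EXIF metadata with Pillow. The file name is made up, and real forensic analysis goes much deeper than this (error-level analysis, sensor noise, and so on).

    ```python
    # Rough sketch only: peek at a photo's EXIF metadata with Pillow
    # (pip install Pillow). "suspect.jpg" is a made-up example path.
    from PIL import Image

    SOFTWARE_TAG = 0x0131  # standard EXIF tag: software that created/edited the file
    DATETIME_TAG = 0x0132  # standard EXIF tag: last modification timestamp

    exif = Image.open("suspect.jpg").getexif()

    print("Software:", exif.get(SOFTWARE_TAG, "<not present>"))
    print("Modified:", exif.get(DATETIME_TAG, "<not present>"))

    # An "Adobe Photoshop" software entry (or a completely stripped EXIF block)
    # is only a weak hint, never proof - metadata is trivial to edit or remove,
    # which is part of why inspecting the file was never conclusive either.
    ```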

  • @[email protected]
    link
    fedilink
    7
    11 months ago

    It’s only a big deal in the first place because of puritan societies like the US or UK. There are societies where nudity is not a big deal anyway, so nude photos of someone are also no big deal.

    Look at Germany for example. Lots of FKK (nude) areas. No one really bats an eye. Of course, there nudity is also not perfectly normalized, especially in some circles. But still, not many are concerned about nude pictures of themselves.

    Obviously AI makes more nude pictures faster than someone skilled at Photoshop. So if your society has a problem with nudity, there will be more problems than before.

    But really, there shouldn’t be a problem anyway.

    • @[email protected]
      link
      fedilink
      Deutsch
      13
      11 months ago

      Look at Germany for example. Lots of FKK (nude) areas. No one really bats an eye.

      We still don’t appreciate our nudes posted online, fake or not.

      • @[email protected]
        link
        fedilink
        8
        edit-2
        11 months ago

        Of course, there nudity is also not perfectly normalized, especially in some circles.

        Also obviously because of privacy reasons people don’t like their pictures posted online, nude or not.

    • @[email protected]
      link
      fedilink
      5
      11 months ago

      They’re not even actual nudes - they’re fake. It seems to me to be no different than putting someone’s head on a Big Bird photo.

      That said, nobody gets to decide what’s offensive to other people. Just do as they ask and don’t do the fake nudes. It’s not like there’s a shortage of porn done by willing adults.

      • @[email protected]
        link
        fedilink
        2
        edit-2
        11 months ago

        I’m not saying people should do it. I’m just talking about a fundamental principle to keep yourself happy: to not be hurt by what other people are doing. In the end, you can’t control what other people will do. But you can control your reaction to what they do.

        Of course, everyone is also allowed to disagree with this advice. I’m just sharing what works for me. If someone wants to feel bad/offended about someone making a fake nude of them, I don’t want to stop anyone doing that (and I can’t).