A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

        • @Chakravanti@sh.itjust.works · 2 years ago

          According to what logic? Like I’m ever going to trust some lying asshole to hide his instructions for fucking anything that’s MINE. News Alert: “Your” computer ain’t yours.

          • @Olgratin_Magmatoe@startrek.website · 2 years ago

            People have been trying to circumvent ChatGPT’s filters, and they’ll do the exact same with open source AI. But it’ll be worse because it’s open source, so any built-in feature to prevent abuse could just get removed and then recompiled by whoever.

            And that’s all even assuming there ever ends up being open source AI.

            • @Chakravanti@sh.itjust.works · 2 years ago

              Your logic is bass-ackwards. Having the source open and public means the shit gets fixed faster. Closed source just doesn’t get fixed 99% of the time because there’s only one motherfucker to do the fixing and he usually just doesn’t do it.

              • You can’t fix it with open source. All it takes is one guy making a fork and removing the safeguards because they believe in free speech or something. You can’t have safeguards against misuse of a tool in an open source environment.

                I agree that closed source AI is bad. But open source doesn’t magically solve the problem.

                • @Chakravanti@sh.itjust.works · 2 years ago

                  Forks are productive. You’re just wrong about it. I’ll take FOSS over closed source. I’ll trust the masses reviewing FOSS over the one asshole doing, or rather not doing, exactly that.

  • Sybil · 2 years ago

    I don’t know what a reasonable "protection" looks like here: the only thing I foresee is 14-year-old boys getting felonies, but no one being protected.

      • Sybil · 2 years ago

        No. The article mentions "protecting" people several times. I don’t see how anyone is protected by the proposed laws.

    • @ahornsirup@sopuli.xyz · 2 years ago

      Even if you don’t want to consider it CSAM, it is, at the very least, sexual harassment. The kids making and circulating these pictures and videos should be facing consequences. And the fear of consequences does offer some degree of protection at least.

        • @ahornsirup@sopuli.xyz · 2 years ago

          If they distribute the drawing, yes. And the difference is that a drawing is immediately recognisable as a drawing, but an AI generated image or video isn’t necessarily easily recognisable as not being real, so the social consequences for the person depicted can be much worse.

      • @pohart@programming.dev · 2 years ago

        It looks like pretty severe sexual harassment at best. Unfortunately the people I think are most likely to do it are teenagers with poor self control who don’t realize the severity.

        I think if schools can implement appropriate restorative responses and education on the harm done, that could be much more effective than draconian punishments after the fact.

    • @Sharkwellington@lemmy.one · 2 years ago

      Right, there are plenty of reactive measures available but the only proactive measures are either restricting availability of the source photos used or restricting use of the deep fake tools used. Everything beyond that is trying to put the genie back in the bottle.

      • @MagicShel@programming.dev · 2 years ago

        It’s not possible to restrict deep fake technology at this point. It’s out there. Accessible to everyone who wants it and has a computer at home.

      • cannache · 2 years ago

        Are we seriously going to try and use someone’s photos for dumb shit like this? Come on, people just want something to wank to or someone to call over to have sex with. Who the hell would actually do this?

        • @CommanderCloon@lemmy.ml · 2 years ago

          Well evidently the answer to your last question is "some people". Your point would only make sense if all this were hypothetical.

      • @interceder270@lemmy.world · 2 years ago

        At some point, communities and social circles need to be able to moderate themselves.

        Disseminating nudes of peers should be grounds for ostracizing, but it really depends on the quality of people around you.

        • @MotoAsh@lemmy.world · 2 years ago

          That doesn’t work. It’s nothing but an inconvenience to not talk to your neighbors or those around you. They’d just get even worse and make even worse friends online.

          Ostracization doesn’t work. Ever. Period. If they’re bad enough, banishment works. Ostracization is just literally ignoring the problem.

          • @interceder270@lemmy.world · 2 years ago

            Ostracization doesn’t work. Ever. Period. If they’re bad enough, banishment works. Ostracization is just literally ignoring the problem.

            That’s just wrong. Unless you’re hanging around shitty people, ignoring the bad ones by definition works.

            • @SuddenDownpour@sh.itjust.works · 2 years ago

              A lot of social circles are dominated by either shitty people or by people too insecure to take a confronting attitude towards those shitty people.

        • @Sharkwellington@lemmy.one · 2 years ago

          And that’s the point I was making, nobody can be “protected” from widely available photos being used on widely available programs. Best we can do is deter but that isn’t a guarantee.

  • @ZombiFrancis@sh.itjust.works · 2 years ago

    In previous generations the kid making fake porn of their classmates was not a well liked kid. Is that reversed now? On the basis of quality of tech?

    • cannache · 2 years ago

      Oooh, that’s bad. Yeah, I would never do that, but I did hear about the idea floating around back in the day, though I don’t think the tech is there yet. It’s just generally not cool.

        • cannache · 2 years ago

          Yeah it sucks bro, but honestly I feel like it just means more people can just chat about porno and have a laugh, and honestly be coy, rather than play

    • Omega · 2 years ago

      That kid that doodles is creepy. But deep fakes probably feel a lot closer to actual nudes.

  • Marxism-Fennekinism · 2 years ago

    Maybe I’m just naive about how many protections we’re actually granted, but shouldn’t this already fall under CP/CSAM legislation in nearly every country?

        • @legios@aussie.zone · 2 years ago

          Australia too. Hentai showing underage people is illegal here. From my understanding it’s all a little grey depending on the state and whether the laws are enforced, but if it’s about victimisation the law will be pretty clear.

          • @Fal@yiffit.net · 2 years ago

            Absolutely absurd. Criminalizing drawings is the stupidest thing in the world.

            This case should already be illegal under harassment or similar laws. There’s no reason to make drawings illegal

            • @Metz@lemmy.world · 2 years ago

              In Germany even a written story about it is illegal. It is considered "textual CSAM" then.

          • @DogMuffins@discuss.tchncs.de · 2 years ago

            Of course they exist. If the AI generated image “depicts” a person, a victim in this case, that person “by definition” exists.

            Your argument evaporates when you consider that all digital images are interpreted and encoded by complex mathematical algorithms. All digital images are “fake” by that definition and therefore the people depicted do not exist. Try explaining that to your 9 year old daughter.

          • @drislands@lemmy.world · 2 years ago

            The article is about real children being used as the basis for AI-generated porn. This isn’t about entirely fabricated images.

          • @Nyanix@lemmy.ca · 2 years ago

            While you’re correct, many of these generators are retaining the source image and only generating masked sections, so the person in the image is still themselves with effectively photoshopped nudity, which would still qualify as child pornography. That is an interesting point that you make, though.

        • @rchive@lemm.ee · 2 years ago

          If you make a picture today of someone based on how they looked 10 years ago, we say it’s depicting that person as the age they were 10 years ago. How is what age they are today relevant?

          • @GeneralVincent@lemmy.world · 2 years ago

            I’m unsure of the point you’re trying to make?

            It’s relevant in this case because the age they are today is underage. A picture of them 10 years ago is underage. And a picture of anyone made by AI to deep fake them nude is unethical regardless of age. But it’s especially concerning when the goal is to depict underage girls as nude. The age thing specifically could get a little complicated in certain situations, I guess, but the intent is obvious most of the time.

            • @rchive@lemm.ee · 2 years ago

              I’m obviously not advocating or defending any particular behavior.

              Legally speaking, why is what age they are today relevant rather than the age they are depicted as in the picture? Like, imagine we have a picture 20 years from now of someone at age 37. It’s legally fine until it’s revealed it was generated in 2023 when the person in question was 17? If the exact same picture was generated a year later it’s fine again?

              • @DogMuffins@discuss.tchncs.de · 2 years ago

                Basically, yes.

                Is the person under-age at the time the image was generated? and … Is the image sexual in nature?

                If yes, then generating or possessing such an image ought to be a crime.

      • @Fal@yiffit.net · 2 years ago

        Won’t somebody think of the make believe computer generated cartoon children?!

  • reading this, I don’t really know what is supposed to be protected here, or what protections would even be possible in the first place.

    closest reasonable one is the girl’s "identity", so it could be fraud. but it’s not used to fool people. more likely, those getting the pics already know this is ai generated.

    so might be defamation?

    the image generation tech is already easily accessible so the girl’s picture being easily accessible might be the weakest link?

      • @Fal@yiffit.net · 2 years ago

        Pretty sure it’s illegal to create sexual images of children, photos or not.

        Maybe in your dystopian countries where drawings are illegal. Absolutely absurd you’re promoting that as a good thing.

        • Trailblazing Braille Taser · 2 years ago

          I’m not exactly sure what your point is. In the article, a kid created an unwanted sexual depiction of another kid and spread it around. I do think that should be illegal.

          • @Fal@yiffit.net · 2 years ago

            Yes, but this thread is about just drawings in general. Deep faking someone into porn and spreading it around should absolutely be illegal. But it’s not “child porn”. It’s some type of harassment or defamation or something.

      • DarkGamer · 2 years ago

        Thanks for the valuable contribution to this discussion! It does appear this is a question of identity and personality rights, regarding how one wants to be portrayed.

        Reading that article though, it seems like that only applies to commercial purposes. If one is making deep fakes for their own non-commercial private use, it doesn’t appear personality rights apply.

    • cannache · 2 years ago

      Tbh you’ve got a point despite the downvotes, but there’s a lot of ladies out there that would be very upset with your viewpoint because they still hold onto the Madonna-whore complex.

    • @Saganaki@lemmy.one · 2 years ago

      I’m not sure if you’re trolling or not here or just lacking empathy.

      Imagine a believable image, let alone video, of you with your full name and age on it participating in a burning cross ceremony in a white hood propagating through the internet.

      This isn’t just some situation where it stays on a single person’s computer—it gets shared. And is effectively unstoppable.

      I’m not claiming to know a way to handle this situation, but it’s really confusing to me that you don’t understand the harms here.

      • DarkGamer · 2 years ago

        Well this technology is going to make said photos not believable, isn’t it?

        • @Saganaki@lemmy.one · 2 years ago

          So that means it will make all images and videos not believable. That’s serious dystopian shit.

          Trust no one. Everything is fake. Nothing is real.

          • DarkGamer · 2 years ago

            Unfortunately that’s the road we’re headed down, and if there’s an off-ramp I don’t see it. Photo and video evidence alone will not be sufficient to prove claims in the near future.

    • @Dimantina@lemmy.world · 2 years ago

      It’s harmful, especially at that age. Psychologically it triggers a sense of personal violation. It’s a sense of privacy being shattered.

      Also, not all people experience sexual awakening/understand they are sexual beings by 14 or even 18. It is confusing/harmful to have that forced upon you.

      Talk to a close female friend or your mother regarding how they feel about AI deepfakes, and how they’d react in high school if this happened to them. Really listen to the answer and you’ll gain a better understanding of the harm done.

      • DarkGamer · 2 years ago

        It isn’t forced upon them though, they’re not even involved.

        Privacy has not been shattered because this is not something that happened in private. In fact, these nudes didn’t happen in reality at all. It’s imagined, either via AI or via human.

        Talk to a close female friend or your mother regarding how they feel about AI deepfakes, and how they’d react in high school if this happened to them. Really listen to the answer and you’ll gain a better understanding of the harm done.

        Said harm is because of social stigma and shame regarding perceptions of being seen nude, which is what I referred to as being weird. It is a vestige of our puritanical past that we could do without.

        Now, if these girls are being harassed, that’s a different matter, that can happen with or without deep fakes. I’m pretty sure we already have methods of dealing with that.

          • DarkGamer · 2 years ago

            Clearly that’s the only reason why I could possibly disagree? Lol, get bent. I just don’t think we should make kids into criminals for using technology to imagine what their classmates look like naked.

            • @RippleEffect@lemm.ee · 2 years ago

              I think they (the kids) should face suspensions and expulsion, but legal repercussions are an entirely different thing when you consider how many mistakes teens make. I don’t think it should be entirely free of legal repercussions, but I would agree that kids are kids.

              It’s always tough when discussing teens because some absolutely know what they’re doing to others and fully intend to be harmful, while others think they’re just performing a prank.

          • DarkGamer · 2 years ago

            If it spreads from peers to them and affects them negatively, it’s arguably harassment, which there are existing methods for dealing with. No different than if it were an offensive doodle or mean gossip, which are also unwanted creations.

            • Flying Squid · 2 years ago

              And what happens in 15 years when an employer finds out that there are images of them doing porn on the internet? How are they going to explain it’s fake when their boss tells them that is the sort of reputation that is harmful to the company?

              • DarkGamer · 2 years ago

                Well, if they are fake I suspect they will say that. If an employer fires them for something they did not do, that’s a huge lawsuit.

                As for proving it, I’m not sure how one does that when this technology matures. Perhaps metadata? Fake porn images have been an issue for some time but usually one can tell if they’ve been doctored, I don’t know if that’s the case with AI deep fakes in the future. Maybe we will need AI to determine if images are AI generated.

                • Flying Squid · 2 years ago

                  Well, if they are fake I suspect they will say that. If an employer fires them for something they did not do, that’s a huge lawsuit.

                  I take it you’ve never been to America before.

        • Snot Flickerman · 2 years ago

          Said harm is because of social stigma and shame regarding perceptions of being seen nude, which is what I referred to as being weird. It is a vestige of our puritanical past that we could do without.

          This has big “if you don’t have anything to hide, you have nothing to fear” energy.

          “Why would you hide behind clothes? Do you have something to hide?”

          Look man, I think our past is puritanical too. However, this is just… I don’t even know how to defend such a skeevy/creepy opinion.

          People do feel violated by such actions, even when they don’t have people harassing them. You can’t lecture us and say they don’t. You’re not the arbiter of how other people feel about things, and feeling violated has nothing to do with prudishness. That’s damage and emotional harm, and you hand-waving it away is pretty fucking gross.

          • DarkGamer · 2 years ago

            If we were talking about someone getting photos of these people nude through their window or similar, I would agree with you. It would be a violation, but that’s not what we’re discussing.

            Feeling violated is not sufficient cause to criminalize this technology. There must be actual harm and I do not believe emotional distress over people looking at facsimiles of a nude photo clears this bar.

            If drawing an illustration of someone nude from imagination is not illegal, neither should this be.

            “Why would you hide behind clothes? Do you have something to hide?”

            AI has no idea what they look like through their clothes, it imagines it based on a data set of other nudes. Deep fakes will never show whatever they want to hide.

    • @UlrikHD@programming.dev · 2 years ago

      Lower skill ceiling. One option can be done by pretty much anyone at a high output volume; the other would require a lot of training and isn’t available to your average basement dweller.

      Good luck trying to regulate it though. Pandora’s box is open and you won’t be able to stop the FOSS community from working on the tech.

  • @virock@lemmy.world · 2 years ago

    I studied Computer Science so I know that the only way to teach an AI agent to stop drawing naked girls is to… give it pictures of naked girls so it can learn what not to draw :(

    • rustydomino · 2 years ago

      Hmmm - I wonder if it makes sense to use generative AI to create negative training data for things like CP. That would essentially be a victimless way to train the AIs. Of course, that creates the conundrum of who actually verifies the AI-generated training data…

      • This doesn’t work. AI still needs to know what CP is in order to create CP for negative use. So you need to first feed it CP. A recent example of how OpenAI was labelling "bad text":

        The premise was simple: feed an AI with labeled examples of violence, hate speech, and sexual abuse, and that tool could learn to detect those forms of toxicity in the wild. That detector would be built into ChatGPT to check whether it was echoing the toxicity of its training data, and filter it out before it ever reached the user. It could also help scrub toxic text from the training datasets of future AI models.

        To get those labels, OpenAI sent tens of thousands of snippets of text to an outsourcing firm in Kenya, beginning in November 2021. Much of that text appeared to have been pulled from the darkest recesses of the internet. Some of it described situations in graphic detail like child sexual abuse, bestiality, murder, suicide, torture, self harm, and incest.

        source: https://time.com/6247678/openai-chatgpt-kenya-workers/
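
As an aside, for readers wondering what "feed an AI with labeled examples" looks like mechanically, below is a minimal sketch of a text filter trained on labeled snippets. The tiny dataset, the placeholder strings, and the 0.5 threshold are all invented for illustration; real moderation systems rely on large human-labeled corpora (the labeling work described in the Time piece above) and far more capable models.

```python
# Minimal sketch of a learned text filter: it can only flag what it has
# been shown labeled examples of. Dataset and threshold are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled snippets: 1 = should be filtered, 0 = benign.
texts = [
    "graphic description of violence",   # stand-in for a flagged snippet
    "threatening and abusive message",   # stand-in for a flagged snippet
    "recipe for vegetable soup",         # benign
    "review of a new laptop",            # benign
]
labels = [1, 1, 0, 0]

# TF-IDF features + logistic regression, wired into one pipeline.
detector = make_pipeline(TfidfVectorizer(), LogisticRegression())
detector.fit(texts, labels)

def is_flagged(snippet: str, threshold: float = 0.5) -> bool:
    """Return True if the model thinks the snippet should be filtered."""
    return detector.predict_proba([snippet])[0][1] >= threshold

print(is_flagged("an abusive and violent rant"))   # likely True
print(is_flagged("a review of a quiet laptop"))    # likely False
```

Even this toy version shows the commenters’ point: the detector has no notion of "disallowed" beyond the labeled examples it was trained on.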

  • @Gork@lemm.ee · 2 years ago

    President Joe Biden signed an executive order in October that, among other things, called for barring the use of generative AI to produce child sexual abuse material or non-consensual “intimate imagery of real individuals.” The order also directs the federal government to issue guidance to label and watermark AI-generated content to help differentiate between authentic and material made by software.

    Step in the right direction, I guess.

    How is the government going to be able to differentiate authentic images/videos from AI generated ones? Some of the AI images are getting super realistic, to the point where it’s difficult for human eyes to tell the difference.
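
On the differentiation question above: one piece of the executive order is labeling and watermarking, which only helps when generators cooperate. Below is a rough sketch, assuming Pillow is installed, of the simplest possible provenance label: a PNG text field written at generation time and checked later. The field names ("ai_generated", "generator") are made-up conventions for illustration, not a real standard; actual proposals such as C2PA content credentials use cryptographically signed manifests, but they share the same limitation that the label must be added by the generator and disappears if the image is screenshotted or re-encoded.

```python
# Minimal sketch of metadata-based labeling for AI-generated images.
# Field names below are invented for illustration, not a real standard.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def save_with_label(img: Image.Image, path: str) -> None:
    """Embed a provenance tag in the PNG's text metadata."""
    meta = PngInfo()
    meta.add_text("ai_generated", "true")
    meta.add_text("generator", "example-model-v1")  # hypothetical model name
    img.save(path, pnginfo=meta)

def is_labeled_ai(path: str) -> bool:
    """Check whether the provenance tag is present."""
    with Image.open(path) as img:
        return img.info.get("ai_generated") == "true"

# Usage: label a blank test image, then verify the tag.
save_with_label(Image.new("RGB", (64, 64)), "labeled.png")
print(is_labeled_ai("labeled.png"))  # True, but the tag does not survive
                                     # screenshots, crops, or re-encoding.
```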

    • @CommanderCloon@lemmy.ml · 2 years ago

      I wouldn’t call this a step in the right direction. A call for a step, yeah, but it’s not actually a step until something is actually done.

    • @apex32@lemmy.world · 2 years ago

      That’s a cool quiz, and it’s from 2022. I’m sure AI has improved since then. Would love to see an updated version.

  • @Aceticon@lemmy.world · 2 years ago

    There might be an upside to all this, though maybe not for these girls: with enough of this people will eventually just stop believing any nude pictures “leaked” are real, which will be a great thing for people who had real nude pictures leaked - which, once on the Internet, are pretty hard to stop spreading - because other people will just presume they’re deepfakes.

    Mind you, it would be a lot better if people in general culturally evolved beyond being preachy monkeys who pass judgment on others because they’ve been photographed in their birthday-suit, but that’s clearly asking too much so I guess simply people assuming all such things are deepfakes until proven otherwise is at least better than the status quo.

  • @gandalf_der_12te@feddit.de · 2 years ago

    Honest opinion:

    We should normalize nudity.

    That’s the only healthy relationship that we can have with our bodies in the long term.

  • @calypsopub@lemmy.world · 2 years ago

    So as a grown woman, I’m not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That’s more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

    I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won’t dare to share it outside their sick incels club.

    • @WoahWoah@lemmy.world · 2 years ago

      That’s fine and well. Except they are videos, and it is very difficult to prove they aren’t you. And the internet is forever.

      This isn’t like high school was when you went to high school.

      Agreed on your last paragraph.

      • Margot Robbie · 2 years ago

        Then nude leak scandals will quickly become a thing of the past, because now every nude video/picture can be assumed to be AI generated and is fake until proven otherwise.

        That’s the silver lining of this entire ordeal.

        Again, this is a content distribution problem more than an AI problem; the liability should be on those who willingly host this deepfake content rather than on AI image generators.

        • @finestnothing@lemmy.world · 2 years ago

          That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends, their jobs, and have their life ruined even if they prove that they are completely innocent

          Plus, something I’ve already seen happen is someone says a nude is fake and are then told they have to prove that it’s fake to get people to believe them… which is very hard without sharing an actual nude that has something unique about their body

          • Derpgon · 2 years ago

            The rest of the human body has more unique traits than the nude parts. Freckles, birthmarks, scars, tattoos. Those are traits that are not possible to replicate unless the person specifically knows about them.

            Now that I think about it, we all probably need a tattoo. That should clear anyone instantly.

            • @WoahWoah@lemmy.world · 2 years ago

              Yes I’m sure a hiring manager is going to involve themselves that deeply in the pornographic video your face pops up in.

              HR probably wouldn’t even allow a conversation about it. That person just never gets called back.

              And then the worst part is the jobs that DO hire you. Now you have to question why they are hiring you. Did they not see the fake porn video? Or did they see it?

              The entire thing is damaging and ugly.