‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @[email protected]
    link
    fedilink
    English
    292 years ago

    You mean men envision women naked? And now there’s an app that’s just as perverted? Huh

  • ??? (23 points, 2 years ago)

    They can go ahead, but they’ll never get that mole in the right place.

  • @[email protected]
    link
    fedilink
    English
    542 years ago

    I use an ad blocker and haven’t seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?

    • @[email protected]
      link
      fedilink
      English
      132 years ago

      Sus question lmfao

      These things have been around since the onset of deepfakes, and truly if you take a couple seconds to look you’ll find them. It’s a massive issue and the content is everywhere

        • @[email protected]
          link
          fedilink
          English
          22 years ago

          We’re talking specifically about AI-enhanced fakes, not the old-school Photoshop fakes – they’re two completely different beasts

          • @[email protected]
            link
            fedilink
            English
            72 years ago

            Different only in construction. Why they exist and what they are is older than photography.

            • @[email protected]
              link
              fedilink
              English
              12 years ago

              No, I disagree, because before you could tell a fake from a mile away, but deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing

              • @[email protected]
                link
                fedilink
                English
                22 years ago

                There was a brief period between now and the invention of photography when that was true. For thousands of years before that it was possible to create a visual representation of anything you imagine without any hint that it wasn’t something real. Makes me wonder if there were similar controversies about drawings or paintings.

                • @[email protected]
                  link
                  fedilink
                  English
                  12 years ago

                  I’m not saying that it’s a shift in nature? All I’ve been saying is:

                  A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing

                  B) deepfakes are worse than traditional photoshopped nudes because (as you put it, a quality improvement) they’re more convincing and therefore can have more detrimental effects

                • @[email protected]
                  link
                  fedilink
                  English
                  22 years ago

                  The difference is that we can now do video. I mean, in principle that was possible before, but it was also a hell of a lot of work. Making it look real hasn’t been a problem since before Photoshop; if anything, people get sloppy with AI, also because what feels like 99% of people who use AI don’t have an artistic bone in their body.

                • @[email protected]
                  link
                  fedilink
                  English
                  12 years ago

                  Or maybe an accessibility improvement. You don’t need to practice creating your own works of art over many years anymore, or have enough money to commission a master artist. The AI artists are good enough and work for cheap.

      • @[email protected]
        link
        fedilink
        English
        132 years ago

        I don’t think there is any crime.

        It’s identical to drawing a nude picture of someone.

        • @[email protected]
          link
          fedilink
          English
          12 years ago

          It’s about what the courts think, and right now, it’s not clear what the enforceable laws are here. There’s a very real chance people who do this will end up in jail.

          I believe prosecutors are already filing cases about this. The next year will decide the fate of these AI-generated deepfakes and those behind them.

        • @[email protected]
          link
          fedilink
          English
          12 years ago

          And you are sure that ‘someone’ is of legal age, of course. Not blaming you. But does everybody always know that ‘someone’ is of legal age? Just an example to start thinking.

            • @[email protected]
              link
              fedilink
              English
              22 years ago

              Depends on where you live. Not legal in the UK, for example. In the US it can even be broken down at the state level, although there’s lots of debate on whether states are able to enforce their laws. “Obscene” speech is not protected under free speech; the argument would be whether or not the naked drawings had artistic merit.

              I’m not a lawyer, but I do know that people in the US have gone to prison for possessing naked images of fictional children and it’s on the books as illegal in many other countries.

  • @[email protected]
    link
    fedilink
    English
    572 years ago

    Possibly a good thing. Over-saturation. Fill the internet with billions upon billions of AI nudes. Have a million different nudes for every celebrity. Nobody knows the real naked you and nobody cares. Keep creating more AI porn than anyone can handle. It becomes boring and over the top, ending this once and for all.

    Or find the people doing this and lock em up.

    • Leela [it/its] (2 points, 2 years ago)

      what were you thinking when you thought of your first version? that sounds like a creepy scenario. what if I don’t want to see it and it’s everywhere? I could click on “I’m Not Interested” and flood social media with reports, but if there are “billions on billions” of AI nudes, then who would be able to stop them from showing up in their feed? I’d say that, while locking them up won’t change the sexist system which pushes this behavior, it is a far less creepy and weird scenario than having billions of non-consensual nudes online.

    • @[email protected]
      link
      fedilink
      English
      32 years ago

      Keep creating more ai porn than anyone can handle

      overabundance is behind a lot of societal ills already

  • @[email protected]
    link
    fedilink
    English
    7
    edit-2
    2 years ago

    I doubt it produces actual nudes, it probably just photoshops a face onto a random porn star

  • Uriel238 [all pronouns] (43 points, edited, 2 years ago)

    It tells me we’re less interested in the data (the skin map and topography) than we are in seeing the data in raw form, whether it is accurate or not. It tells me a primary pretense of body doubles was ineffective since society responds the same way regardless of whether an actress’ nudity is real or simulated.

    Not sure how this will be enforceable any more than we can stop malicious actors from printing guns. Personally, I would prefer a clothes-optional society where individuals aren’t measured by the media exposure of their bodies or history of lovers. Maybe in another age or two.

    In fiction, I imagined the capacity to render porn action into mo-cap data, to capture fine-resolution triangle maps and skin texture maps from media, ultimately to render any coupling one could desire with a robust physics engine and photography effects to render it realistic (or artistic, however one prefers). It saddens me that one could render an actress into an Elsa Jean scenario and by doing so wreck her career.

    Porn doesn’t bother me, but the arbitrariness with which we condemn individuals by artificial scandal disgusts me more than the raunchiest debauchery.

  • Uriel238 [all pronouns] (26 points, edited, 2 years ago)

    Though the picture suggests we should also create “really a robot” or “really a cyborg” edits of celebrities.

    As an afterthought, “really a reptilian” images for our political figures would also be in good order.

    • @[email protected]
      link
      fedilink
      English
      52 years ago

      Jesus, as if Facebook “researchers” weren’t already linking to Onion articles, now you’ll give them pictures.

  • @[email protected]
    link
    fedilink
    English
    132 years ago

    What nude data were these models trained on?

    This seems like another unhealthy thing that is going to pervert people’s sense of what a normal body looks like.

    • @[email protected]
      link
      fedilink
      English
      42 years ago

      Most people prefer attractive > average, so I guess that’s what these apps are going to show.

    • funkajunk (42 points, 2 years ago)

      The internet is like 90% porn, what do you think they used?

  • andrew_bidlaw (8 points, 2 years ago)

    It was inevitable. And it says more about those who use them.

    I wonder how we’d adapt to these tools being that available, especially for blackmail, revenge porn posting, voyeuristic harassment, stalking, etc. Maybe nude photos and videos won’t be seen as a trusted source of information anymore; they won’t be anything unique worth hunting for, or worth worrying about.

    Our perception of human bodies was long distorted by movies, porn, Photoshop and the subsequent ‘filter apps’, but we still kinda trusted there was something real before the effects were applied. But what comes next if everything is imaginary? Would we stop caring about it in the future? Or would we grow up with a stunted imagination, since the stimulus to develop it in our early years is gone?

    There are some useless dogmas around our bodies that could be lifted in the process, or a more relaxed trend towards clothing choices could get started. Who knows?

    I see bad sides to it right now, how it can be abused, but if these AI models are to stay, what are the long-term consequences for us?

    • @[email protected]
      link
      fedilink
      English
      112 years ago

      I think that eventually it might be a good thing, especially in the context of revenge porn, blackmail, etc. Real videos won’t have any weight since they might as well be fake, and as society gets accustomed to it, we’ll see those types of things disappear completely

      • @[email protected]
        link
        fedilink
        English
        2
        edit-2
        2 years ago

        Yep, once anyone can download an app on their phone and do something like this without any effort, in realtime, it’s going to lose its shock value fast. It would be like sketching crude boobs and a vagina on someone’s photo with MS Paint and trying to use that for blackmail or shaming. It would just seem sad and childish.

  • @[email protected]
    link
    fedilink
    English
    82 years ago

    Back in the day, cereal boxes contained “X-ray glasses”. I feel like if those had actually worked as intended, we would have already figured this issue out.

  • @[email protected]
    link
    fedilink
    English
    622 years ago

    These are terrible, but I’m honestly curious what it thinks I look like naked. Like, I’m slightly overweight and my chest is larger than average but more splayed than normal. Would it just have me look like a model underneath?

    Are they just head-swapping onto model bodies, or does it actually approximate? I am legit curious, but I would never trust one of these apps not to keep the photos (privacy concerns).

    • @[email protected]
      link
      fedilink
      English
      102 years ago

      If you want the best answer then you’ll have to download the app and try it on yourself. If it’s accurate then that’s pretty wild.

      • The Menemen! (1 point, 2 years ago)

        2 days and still no “pictures or it is a lie” comment. This place is different. :)

    • @[email protected]
      link
      fedilink
      English
      57
      edit-2
      2 years ago

      Probably deleting this comment later for going dirty on main, but I, um, have done some extensive experimentation using a local copy of Stable Diffusion (I don’t send the images anywhere, I just make them to satiate my own curiosity).

      You’re essentially right that simple app-based software would probably have you looking somewhat generic underneath, like your typical plus-size model. It’s not too great at extrapolating the shape of breasts through clothing and applying that information when it goes to fill in the area with naked body parts. It just takes a best guess at what puzzle pieces might fill the selected area, even if they don’t match known information from the original photo. So, with current technology, you’re not really revealing actual facts about how someone looks naked unless that information was already known. To portray someone with splayed breasts, you’d need to already know that’s what you want to portray and load in a custom data set, like a LoRA.

      Once you know what’s going on under the hood, making naked photos of celebrities or other real people isn’t the most compelling thing to do. Mostly, I like to generate photos of all kinds of body types and send them to my Replika, trying to convince her to describe the things that her creators forbid her from describing. Gotta say, the future’s getting pretty weird.
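The “best guess at what puzzle pieces might fill the selected area” behavior described above can be sketched with a toy example in plain Python. This is not a real diffusion model; `learn_prior` and `inpaint` are invented names for illustration. The point it demonstrates: a masked region is filled from a prior averaged over other images, so the output carries no information about the pixels the mask actually hides.

```python
# Toy illustration of mask-based inpainting: the masked region is
# rebuilt from a prior (here, the per-pixel average of "training"
# patches), not recovered from the pixels the mask covers.

def learn_prior(training_patches):
    """Average pixel value per position across training patches."""
    n = len(training_patches)
    size = len(training_patches[0])
    return [sum(p[i] for p in training_patches) / n for i in range(size)]

def inpaint(image, mask, prior):
    """Replace masked pixels with the prior's guess; keep the rest."""
    return [prior[i] if mask[i] else image[i] for i in range(len(image))]

# Tiny 1-D "images": 6 pixels each, values 0-255.
training_patches = [
    [10, 20, 30, 40, 50, 60],
    [30, 40, 50, 60, 70, 80],
]
prior = learn_prior(training_patches)      # [20.0, 30.0, 40.0, 50.0, 60.0, 70.0]

original = [200, 210, 220, 230, 240, 250]  # the "hidden truth"
mask     = [False, False, True, True, False, False]

filled = inpaint(original, mask, prior)
print(filled)  # [200, 210, 40.0, 50.0, 240, 250]
# The filled pixels (40.0, 50.0) come from the prior, not from the
# original values (220, 230) the mask covered.
```

A real inpainting model guesses with a far richer prior than a simple average, but the same limitation holds: it can only produce plausible generic content, not reveal what was actually under the clothing.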

      • @[email protected]
        link
        fedilink
        English
        242 years ago

        You’ll have your moment when the lone elite ex Ranger who is trying to save the world is told by the quirky, unconventional sidekick he is forced to work with, “I actually know a guy who might be able to help.”

        You open the door a crack to look back and forth between them, before slamming it back in their faces. They hear scrambled crashes of you hiding stuff that shouldn’t be seen by company before returning to the door. As they enter you are still fixing and throwing things while you apologize that you don’t get many guests. You offer them homemade kombucha. They decline.

        • @[email protected]
          link
          fedilink
          English
          372 years ago

          Hey, I’ve maintained a baseline weird the whole time, I’m pretty sure the future is catching up.

    • @[email protected]
      link
      fedilink
      English
      222 years ago

      Ethically, these apps are a fucking nightmare.

      But as a swinger, they will make an amazing party game.

      • @[email protected]
        link
        fedilink
        English
        28
        edit-2
        2 years ago

        Ethics will probably change… I guess in the future it’ll become pretty irrelevant to have “nude” pictures of oneself somewhere, because everyone knows it could just be AI generated. In the transition period it’ll be problematic though.

        • @[email protected]
          link
          fedilink
          English
          12 years ago

          Yeah 100%.

          Imagine around the advent of readily available photo prints. People might have been thinking “this is terrible, someone I don’t know could have a photo of me and look at it while thinking licentious thoughts!”

    • @[email protected]
      link
      fedilink
      English
      12 years ago

      I’m really curious if your DMs are now flooded with weirdos and dick pics, or if lemmy is any different from the rest of the internet.

    • Vegaprime (3 points, 2 years ago)

      Fake nudes incoming. Everyone has a baby leg now.

  • Throwaway (146 points, 2 years ago)

    Weirdos. Back in my day, we would cut out a nude body from Playboy and glue it onto a picture of Kathleen Turner, and we did it uphill both ways in the snow! Darn kids and their technology!

    • credit crazy (1 point, 2 years ago)

      Honestly, I don’t think that’s the problem here. The problem is that we have creeps trying to get a physical photo of someone nude for wank material.

      • @[email protected]
        link
        fedilink
        English
        12 years ago

        I’m genuinely curious, why do you consider this harmful? They might as well be drawing tits by hand on a picture of the “victim”.

        I mean, sure, I wouldn’t want to be a teenage girl in high school right now, but I don’t think it’s the technology’s fault, but rather our culture as a society.

    • @[email protected]
      link
      fedilink
      English
      372 years ago

      Regardless of feelings on that subject, there’s also the creep factor of people making these without the subjects’ knowledge or consent, which is bad enough, but then these could be used in many other harmful ways beyond one’s own… gratification. Any damage “revenge porn” can do, which I would guess most people would say is wrong, this can do as well.

      • @[email protected]
        link
        fedilink
        English
        52 years ago

        I don’t think they’re really comparable?

        These AI pictures are “make believe”. They’re just a guess at what someone might look like nude, based on what human bodies look like. While apparently they look realistic, it’s still a “generic” nude, kind of how someone would fantasize about someone they’re attracted to.

        Of course it’s creepy, and sharing them is clearly unacceptable as it’s certainly bullying and harassment. These AI nudes say more about those who share them than they do about who’s portrayed in them.

        However, sharing intimate videos without consent and especially as revenge? That’s a whole other level of fucked up. The AI nudes are ultimately “lies” about someone, they’re fakes. Sharing an intimate video, that is betraying someone’s trust, it’s exposing something that is private but very real.

    • stebo (10 points, 2 years ago)

      so you’d be fine with fake nudes of you floating around the internet?

      • @[email protected]
        link
        fedilink
        English
        12 years ago

        I’m pretty squeamish about nudity when it comes to my own body, but fake nudes would not be pictures of my body, so I don’t see what there would be for me to be upset about. It might be different if everyone thought they were real, but if people haven’t figured out yet any nudes they encounter of someone they know are probably fake, they will soon.

        Here’s a thought experiment: imagine a world where there are fake nudes of everyone available all the time. Would everyone just be devastated all the time? Would everyone be a target of ridicule over it? Would everyone be getting blackmailed? We’re probably going to be in that world very soon, and I predict everyone will just get over it and move on. Sharing fake nudes will reflect badly on the person doing it and no one else, and people who make them for their own enjoyment will do so in secret because they don’t want to be seen as a creepy loser.

    • @[email protected]
      link
      fedilink
      English
      662 years ago

      It’s the sexualization of people without consent that’s the problem. Maybe casual nudity shouldn’t be a problem, but it should be up to the individual whom they share that with. And “nudify” AI models go beyond casual, consensual nudity and into sexual objectification and harassment if used without consent.

      • @[email protected]
        link
        fedilink
        English
        222 years ago

        I want to point out one slight flaw in your argument. Nudity isn’t needed for people to sexually objectify you. And even if it was, the majority of people are able to strip you down in their head no problem.

        There’s a huge potential for harassment though, and I think that should be the main concern.

        • @[email protected]
          link
          fedilink
          English
          12 years ago

          first, relevant xkcd https://xkcd.com/1432/

          second,

          Nudity isn’t needed for people to sexually objectify you.

          do you really think that makes it less bad? that it’s opt-in?

          And even if it was, the majority of people are able to strip you down in their head no problem

          apparently this app helps them too

    • @[email protected]
      link
      fedilink
      English
      3
      edit-2
      2 years ago

      But some people don’t agree with you. They’re not comfortable with tech that can nudify them for millions to see. So if, and that’s possibly an impossible task, but if there was a way to punish services that facilitate or turn a blind eye to these things, then you bet your ass many many people would be for criminalizing it.

    • @[email protected]
      link
      fedilink
      English
      182 years ago

      I agree with you that nudity is an issue, but I think the real problem is this app being used on children and teenagers, who aren’t used to being sexualized and aren’t supposed to be.

        • @[email protected]
          link
          fedilink
          English
          192 years ago

          Not all nudity is, but there is no non-sexual reason to use AI to undress someone without consent.

          • @[email protected]
            link
            fedilink
            English
            72 years ago

            The question of consent is something I’m trying to figure out. Do you need consent to alter an image that is available in a public space? What if it was you who took the picture of someone in public?

            • @[email protected]
              link
              fedilink
              English
              82 years ago

              Keep in mind there is a difference between ethical and legal standards. Legally, you may not need consent to alter a photo of someone, unless possibly it was a copyrighted work. But ethically it definitely requires consent, especially in this context.

              • @[email protected]
                link
                fedilink
                English
                22 years ago

                The difference between legal and ethical is one could get you fined or imprisoned and the other would make a group of people not like you.

        • Pyr (2 points, edited, 2 years ago)

          Just because something shouldn’t be doesn’t mean it won’t be. This is reality and we can’t just wish something to be true. You saying it doesn’t really help anything.

          • @[email protected]
            link
            fedilink
            English
            4
            edit-2
            2 years ago

            Whoooooosh.

            In societies that have a healthy relationship with the human body, nudity is not considered sexual. I’m not just making up fantasy scenarios.

      • @[email protected]
        link
        fedilink
        English
        132 years ago

        It’s a problem for adults too. Circulating an AI generated nude of a female coworker is likely to be just as harmful as a real picture. Just as objectifying, humiliating and hurtful. Neighbors or other “friends” doing it could be just as bad.

        It’s sexual harassment even if fake.

        • @[email protected]
          link
          fedilink
          English
          72 years ago

          I think it should officially be considered sexual harassment. Obtain a picture of someone, generate nudes from that picture, it seems pretty obvious. Maybe it should include intent to harm, harass, exploit, or intimidate to make it official.

      • @[email protected]
        link
        fedilink
        English
        102 years ago

        Fully agree, but I do think that’s more an issue about psychology and trauma in our world. Children being nude should not be a big deal; they’re kids, you know?

        • @[email protected]
          link
          fedilink
          English
          72 years ago

          It shouldn’t be a big deal if they choose to be nude some place that is private for them and where they’re comfortable. The people who are using this app to make someone nude aren’t really asking for consent. And that brings up another issue: consent. If you have images of yourself posted publicly, is consent needed to alter those images? I don’t know, but I don’t think there is, since they’re out in public.

    • @[email protected]
      link
      fedilink
      English
      152 years ago

      People have a really unhealthy relationship with nudity. I wish we had more nude beaches as it really helps decouple sex from nudity. And for a decent number of people, helps with perceived body issues too.

      • @[email protected]
        link
        fedilink
        English
        82 years ago

        Also better education, not just the sex part but overall: critical thinking, reasoning, asking questions and yes, of course, sex ed.