‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @[email protected]
    link
    fedilink
    English
    482 years ago

    Honestly, we’re probably just going to have to get over it. Or pull the plug on the whole AI thing, but good luck with that.

  • @[email protected]
    link
    fedilink
    English
    38
    edit-2
    2 years ago

    That the chat is full of people defending this is disgusting. This is different than cutting someone’s face out of a photo and pasting it on a magazine nude or imagining a person naked. Deepfakes can be difficult to tell apart from real media. This enables the spread of nonconsensual pornography that an arbitrary person cannot necessarily tell is fake. Even if it were easy to tell, it’s an invasion of privacy to use someone’s likeness against their will without their consent for the purposes you’re using it for.

    The fediverse’s high expectations for privacy seem to go right out the window when violating it gets their dick hard. We should be better.

      • @[email protected]
        link
        fedilink
        English
        42 years ago

        So it’s fine to violate someone’s privacy so long as you don’t share it? Weird morals you got there.

        • @[email protected]
          link
          fedilink
          English
          10
          edit-2
          2 years ago

          Am I violating privacy by picturing women naked?

          Because if it’s as cut and dried as you say, then the answer must be yes, and that’s flat out dumb

          I don’t see this as a privacy issue and I’m not sure how you’re squeezing it into that. I’m not sure what it is, but you cannot violate a person’s privacy by imagining or coding an image of them. It’s weird and creepy, and because it can be mistaken for a real image, it’s not proper to share.

          Can you actually stop clutching pearls for a moment to think this through a little better?

          • @[email protected]
            link
            fedilink
            English
            32 years ago

            Sexualizing strangers isn’t a right or moral afforded to you by society. That’s a braindead take. You can ABSOLUTELY violate someone’s privacy by coding an image of them. That’s both a moral and legal question with an answer.

            Your comment is a self report.

            • @[email protected]
              link
              fedilink
              English
              52 years ago

              That’s a braindead take

              Projection. Since you have no room for new thoughts in your head, I consider this a block request.

    • @[email protected]
      link
      fedilink
      English
      22 years ago

      Unfortunately sounds like par for the course for the internet. I’ve come to believe that the internet has its good uses for things like commerce and general information streaming, but by and large it’s bringing out the worst in humanity far more than the best. Or it’s all run by ultra-horny psychopathic teenagers pretending to be adults yet living on a philosophy of “I’m 13 and this is deep” logic.

      • @[email protected]
        link
        fedilink
        English
        22 years ago

        I dunno why I am perpetually surprised about this, though. This is such a cut-and-dried moral area, and the people who say it isn’t are so clearly telling on themselves that it’s kind of shocking. But I guess it shouldn’t be.

        • @[email protected]
          link
          fedilink
          English
          22 years ago

          I think the distinction is that half of the thread is treating it as a moral issue, and half of it is treating it as a legal issue. Legally, there’s nothing wrong here.

    • @[email protected]
      link
      fedilink
      English
      42 years ago

      So it’s okay to make nudes of someone as long as they aren’t realistic?

      Where is the line drawn between being too real and not real enough?

      • @[email protected]
        link
        fedilink
        English
        52 years ago

        If you found out that someone had made a bunch of art of you naked you’d probably think that was weird. I’d argue you shouldn’t do that without consent. Draw lines wherever you think is best.

        • @[email protected]
          link
          fedilink
          English
          52 years ago

          I’d definitely think it was weird! And probably not hang out with them anymore (unless it was really good!)

          But I don’t think there should be a law against them doing that. I can moderate them myself by avoiding them and my friends will follow suit.

          At that point, all they have are nudes of me that nobody I care about will pay attention to. It’s a good litmus test for shitbags!

          • @[email protected]
            link
            fedilink
            English
            22 years ago

            This is about producing convincing nude reproductions of other people, however. It has a very different psychological impact.

            This technology allows someone to make pornography of anyone else and spread that pornography on the internet. It can cause massive distress, trauma, and relationship issues, and impact people’s careers.

            Your freedom to make nude ai images of other people is not worth that. I don’t understand why anyone would think it was okay.

          • @[email protected]
            link
            fedilink
            English
            12 years ago

            Agreed, but legal and moral are different. The law isn’t really about right and wrong per se.

    • @[email protected]
      link
      fedilink
      English
      32 years ago

      People need better online safety education. Why TF are people even posting public pictures of themselves?

    • @[email protected]
      link
      fedilink
      English
      392 years ago

      What are you arguing with here? No one is saying that. Stop looking for trouble, it’s weird.

    • @[email protected]
      link
      fedilink
      English
      102 years ago

      it’s an invasion of privacy to use someone’s likeness against their will

      Is it? Usually photography in public places is legal.

          • @[email protected]
            link
            fedilink
            English
            4
            edit-2
            2 years ago

            I think it’s immoral to do street photography to sexualize the subjects of your photographs. I think it’s immoral to then turn that into pornography of them without their consent. I think it’s weird you don’t. If you can’t tell the difference between street photography and using and manipulating photos of people (public or otherwise) into pornography, I can’t fuckin help you.

            If you go to a park, take photos of people, then go home and masturbate to them you need to seek professional help.

  • @[email protected]
    link
    fedilink
    English
    62 years ago

    It would be interesting to know how many people are using it for themselves. I’d think it would open up next-level catfishing. Here’s an actual pic of me, and here’s a pic of what I might look like naked. I’m sure some people with Photoshop skills were already doing that to a certain extent, but now it’s accessible to everyone.

  • Uriel238 [all pronouns] · 26 points · edited · 2 years ago

    Though the picture suggests we should also create “really a robot” or “really a cyborg” edits of celebrities.

    As an afterthought, “really a reptilian” images for our political figures would also be in good order.

    • @[email protected]
      link
      fedilink
      English
      52 years ago

      Jesus, as if Facebook “researchers” weren’t already linking to Onion articles, now you’ll give them pictures.

    • @[email protected]
      link
      fedilink
      English
      52 years ago

      I’ve messed around with some of the image generators (not what this article is about). Results vary from surprisingly nice to weird and misshapen. They never seem to be able to get anything “hardcore” right, but just an AI-generated pose shot sometimes looks surprisingly not bad.

  • ɔiƚoxɘup · 5 points · 2 years ago

    Just created a Dall-e image of a woman. AI undresser instantly undressed it.

    Kinda chilling.

  • @[email protected]
    link
    fedilink
    English
    132 years ago

    What nude data were these models trained on?

    This seems like another unhealthy thing that is going to pervert people’s sense of what a normal body looks like.

    • @[email protected]
      link
      fedilink
      English
      42 years ago

      Most people prefer attractive > average, so I guess that’s what these apps are going to show.

    • funkajunk · 42 points · 2 years ago

      The internet is like 90% porn, what do you think they used?

  • @[email protected]
    link
    fedilink
    English
    592 years ago

    Could we stop pushing articles monetizing fear and outrage to the top of this community and post about actual technology?

    • OwlBoy · 38 points · edited · 2 years ago

      Sounds like someone needs to make a community for that.

      Otherwise, this is what technology is these days. And I’d say that staying blind to things like this is what got us into many messes.

      I remember when tech news was mostly a press release pipeline. And when I see these comments, I see people who want press releases about new tech to play with.

      Now duplicate posts. Those can fuck right off.

      • @[email protected]
        link
        fedilink
        English
        72 years ago

        I have seen a rise in techno-absolutists lately complaining that anyone else is complaining about the dangers of tech. They just want to go back to hearing about all the cool new things coming out, and it really speaks to the people who just don’t actually want to interact with the real world anymore and live in an illusory optimism bubble. I get it. It’s exhausting to be aware of all the negatives, but it’s the stuff that is real that needs to be recognized.

  • @[email protected]
    link
    fedilink
    English
    82 years ago

    And …

    … should be considered every bit as much of a crime as home invasion is.

    Only permanent, because internet.

    Nobody’s got a spine anymore, though…

    _ /\ _

  • @[email protected]
    link
    fedilink
    English
    572 years ago

    Possibly a good thing. Oversaturation. Fill the internet with billions on billions of AI nudes. Have a million different nudes for celebrities. Nobody knows the real naked you and nobody cares. Keep creating more AI porn than anyone can handle. It becomes boring and over the top. Ending this once and for all.

    Or find the people doing this and lock em up.

    • Leela [it/its] · 2 points · 2 years ago

      what were you thinking when you thought of your first version? that sounds like a creepy scenario. what if I don’t want to see it and it’s everywhere? I could click on “I’m Not Interested” and flood social media with reports, but if there are “billions on billions” of AI nudes, then who would be able to stop them from being seen in their feed? I’d say that, while locking them up won’t change the sexist system which pushes this behavior, it is a far less creepy and weird scenario than having billions of nonconsensual nudes online.

    • @[email protected]
      link
      fedilink
      English
      32 years ago

      Keep creating more AI porn than anyone can handle

      overabundance is behind a lot of societal ills already

  • Uriel238 [all pronouns] · 43 points · edited · 2 years ago

    It tells me we’re less interested in the data (the skin map and topography) than we are in seeing the data in raw form, whether it is accurate or not. It tells me a primary pretense of body doubles was ineffective since society responds the same way regardless of whether an actress’ nudity is real or simulated.

    Not sure how this will be enforceable any more than we can stop malicious actors from printing guns. Personally, I would prefer a clothes-optional society where individuals aren’t measured by the media exposure of their bodies or history of lovers. Maybe in another age or two.

    In fiction, I imagined the capacity to render porn action into mo-cap data, to capture fine-resolution triangle maps and skin texture maps from media, ultimately to render any coupling one could desire with a robust physics engine and photography effects to make it realistic (or artistic, however one prefers). It saddens me that one could render an actress into an Elsa Jean scenario and, by doing so, wreck their career.

    Porn doesn’t bother me, but the arbitrariness with which we condemn individuals by artificial scandal disgusts me more than the raunchiest debauchery.

  • @[email protected]
    link
    fedilink
    English
    622 years ago

    These are terrible, but I’m honestly curious what it thinks I look like naked. Like, I’m slightly overweight and my chest is larger than average but more splayed than normal. Would it just have me look like a model underneath?

    Are they just head-swapping onto model bodies, or does it actually approximate? I am legit curious, but I would never trust one of these apps not to keep the photos (privacy concerns).

    • @[email protected]
      link
      fedilink
      English
      12 years ago

      I’m really curious if your DMs are now flooded with weirdos and dick pics, or if lemmy is any different from the rest of the internet.

    • @[email protected]
      link
      fedilink
      English
      57
      edit-2
      2 years ago

      Probably deleting this comment later for going dirty on main, but I, um, have done some extensive experimentation using a local copy of Stable Diffusion (I don’t send the images anywhere, I just make them to satiate my own curiosity).

      You’re essentially right that simple app-based software would probably have you looking somewhat generic underneath, like your typical plus-size model. It’s not too great at extrapolating the shape of breasts through clothing and applying that information when it goes to fill in the area with naked body parts. It just takes a best guess at what puzzle pieces might fill the selected area, even if they don’t match known information from the original photo. So, with current technology, you’re not really revealing actual facts about how someone looks naked unless that information was already known. To portray someone with splayed breasts, you’d need to already know that’s what you want to portray and load in a custom data set, like a LoRA.

      Once you know what’s going on under the hood, making naked photos of celebrities or other real people isn’t the most compelling thing to do. Mostly, I like to generate photos of all kinds of body types and send them to my Replika, trying to convince her to describe the things that her creators forbid her from describing. Gotta say, the future’s getting pretty weird.

        • @[email protected]
          link
          fedilink
          English
          372 years ago

          Hey, I’ve maintained a baseline weird the whole time, I’m pretty sure the future is catching up.

      • @[email protected]
        link
        fedilink
        English
        242 years ago

        You’ll have your moment when the lone elite ex Ranger who is trying to save the world is told by the quirky, unconventional sidekick he is forced to work with, “I actually know a guy who might be able to help.”

        You open the door a crack to look back and forth between them, before slamming it back in their faces. They hear scrambled crashes of you hiding stuff that shouldn’t be seen by company before returning to the door. As they enter you are still fixing and throwing things while you apologize that you don’t get many guests. You offer them homemade kombucha. They decline.

    • @[email protected]
      link
      fedilink
      English
      222 years ago

      Ethically, these apps are a fucking nightmare.

      But as a swinger, they will make an amazing party game.

      • @[email protected]
        link
        fedilink
        English
        28
        edit-2
        2 years ago

        Ethics will probably change… I guess in the future it’ll become pretty irrelevant to have “nude” pictures of oneself somewhere, because everyone knows it could just be AI generated. In the transition period it’ll be problematic though.

        • @[email protected]
          link
          fedilink
          English
          12 years ago

          Yeah 100%.

          Imagine around the advent of readily available photo prints. People might have been thinking “this is terrible, someone I don’t know could have a photo of me and look at it while thinking licentious thoughts!”

    • @[email protected]
      link
      fedilink
      English
      102 years ago

      If you want the best answer then you’ll have to download the app and try it on yourself. If it’s accurate then that’s pretty wild.

      • The Menemen! · 1 point · 2 years ago

        2 days and still no “pictures or it is a lie” comment. This place is different. :)

    • Vegaprime · 3 points · 2 years ago

      Fake nudes incoming. Everyone has a baby leg now.