A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

    • @lloram239@feddit.de · 3 points · 2 years ago (edited)

      The images on the site aren’t very good (the typical low-detail, airbrushed look), nor are they generated very fast. See the examples here (mostly SFW) for what you can actually do; it takes about 15 seconds per image on a mid-range gaming PC.

      That said, one big limitation of current AI models remains: it’s always images of a single subject; they can’t do multiple subjects or complex interactions. Facial expressions also still look quite bland. This can be worked around with inpainting and similar tools, but plain text prompts have a hard time generating interesting images.
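      For anyone curious what the inpainting workaround looks like in practice, here is a rough sketch using the diffusers library; the checkpoint id and file names are placeholder assumptions, and any Stable Diffusion inpainting model works the same way:

```python
# Rough sketch of the inpainting workaround mentioned above: mask off a
# region of a generated image (e.g. the face) and regenerate only that
# area with a more specific prompt. Checkpoint id and file names are
# placeholder assumptions.
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting",  # assumed checkpoint
    torch_dtype=torch.float16,
).to("cuda")

image = Image.open("generated.png").convert("RGB")  # the flawed base image
mask = Image.open("face_mask.png").convert("RGB")   # white = region to redraw

result = pipe(
    prompt="portrait photo, natural relaxed facial expression, detailed skin",
    negative_prompt="blurry, airbrushed, deformed",
    image=image,
    mask_image=mask,
    num_inference_steps=30,
).images[0]
result.save("fixed.png")
```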

    • @Buddahriffic@lemmy.world · 29 points · 2 years ago

      Some things I was able to put my finger on after looking at a bunch of the images in the feed:

      • It doesn’t do skin well, treating it more like a smooth plastic than a surface with pores, wrinkles, and fine hairs.
      • It doesn’t understand lighting, so shadows don’t agree with highlights or even each other.
      • It doesn’t understand anatomy. A lot of the images were fine in this regard but others had misplaced muscles, bones, and impossible limb positioning/truncation.
      • It has no idea how to draw vaginas. Nipples are also not well understood, though it does better on average with those. They still look more like plastic than skin, but most of them were at least passable, while I didn’t see a single vagina that looked even close to right.
      • Cethin · 20 points · 2 years ago

        To be fair, so many humans drawing porn don’t understand anatomy or what vaginas look like. It’s hard to train when your input data is already bad.

        • @Buddahriffic@lemmy.world · 7 points · 2 years ago

          Yeah, a lot of the vaginas on that site look like hentai vaginas. I can understand it more with AIs since vaginas have a ton of variance to them, so trying to make an “average” vagina will end up not looking like any actual one.

          But the artists who draw them like that just disappoint me (with the caveat that I have no reason to believe I’d do any better if I were to draw one). There’s a ton of inspiration out there, and many do an amazing job with the rest of the body, so why make one of the important parts (for porn) look like an afterthought?

          Though the anime-style ones aren’t as bad as some others, where the AI seems like it’s trying to average boy and girl parts lol.

          • Piecemakers · 3 points · 2 years ago

            To be fair, the focus of this era’s porn has hardly ever been the vagina, under any amount of scrutiny. A few aspects of sex that porn prioritizes higher are, in no particular order: bounce, sounds, phrasing, setting, texture, animalism/passion, power roles, etc. Hell, more effort has probably gone into visually hiding prophylactic use than into the vagina itself.

            I’m not in any way saying I agree with this, simply pointing out the facts as they are these days.

          • Spaz · 6 points · 2 years ago

            Tbh, if people don’t know the difference between a vagina and a vulva, I don’t expect AI to do a great job generating any good porn.

            • @Buddahriffic@lemmy.world · 1 point · 2 years ago

              Using a term casually rather than medically won’t affect the quality of AI porn. Though maybe ensuring it knows the difference between the labia, clit, urethra, vagina, and asshole would produce better results.

  • magnetosphere · 9 points · 2 years ago (edited)

    I think words like “thirsty” and “thirst trap” are self defeating. Actual thirst is your body’s biological response to a need for water. Without water, you’ll die. Calling porn sites “thirst traps” suggests that people have a critical need for porn - which is often contrary to the point the author is trying to make.

    Arguing about the ethics or morality of something that’s “necessary” for survival is irrelevant. Anyone who’s against pornography is perfectly within their rights to share their opinion, but they should avoid the word “thirst”.

    Edit: I know the issue of pornography can evoke strong feelings in some, but I’m only talking about word choice, folks.

    I’m also not one of those tiresome people who refuse to admit that they’re wrong and act like a child when faced with sound opposition. If someone would actually put their opinion into words, instead of just downvoting, I’d appreciate it.

    • JackbyDev · 2 points · 2 years ago

      It’s slang. There are tons of words in every language that don’t make sense but get used a certain way because that’s how people use them. Next thing you’re gonna tell me is that in the 1980s when people called things bad you were upset because they actually meant good.

      • magnetosphere · 6 points · 2 years ago

        A redditor would try to hide their gross opinion behind a thin veneer of logic. I’m not getting into the ethics of porn at all.

        I’m just talking about phrasing. I think “thirst” is a poor analogy.

        • JackbyDev · 3 points · 2 years ago

          I think words like “thirsty” and “thirst trap” are self defeating. Actual thirst is your body’s biological response to a need for water. Without water, you’ll die. Calling porn sites “thirst traps” suggests that people have a critical need for porn - which is often contrary to the point the author is trying to make.

          a thin veneer of logic

    • @Ropianos@feddit.de · 4 points · 2 years ago (edited)

      Consider another association with thirst: desperation. In the mind of the author, porn consumption is negative, so anyone consuming porn is doing so out of desperation, despite knowing better. It essentially describes people being controlled by their base instincts. And thus the site is a trap, luring people in against their will.

      That is how I would interpret the word thirst in this context anyway. It’s not about a critical need, it’s about thirst being irrational and highly compulsive.

      • magnetosphere · 6 points · 2 years ago

        Awesome! Thank you! This is the kind of thought provoking response I was hoping for.

        Additionally, it’s a really good point. I totally missed this interpretation, and I think it’s better than mine.

        • JackbyDev · 2 points · 2 years ago

          That’s hardly even an interpretation. Thirst and hunger as verbs have always meant to desire something.

        • @Ropianos@feddit.de · 1 point · 2 years ago (edited)

          Sure, no worries! I haven’t been disappointed yet by responding to downvoted comments so I will keep doing it :)

          Just another similar metaphor: Power hungry. Not the stove kind but the dictator kind. To be honest, there are quite a lot of body related metaphors, e.g. drowning in trouble, blinded by ambition. I guess it comes down to evoking some strong emotion.

          • magnetosphere · 3 points · 2 years ago

            I think a lot of people (myself included) are used to the combative, hostile environment of reddit. An actual friendly disagreement is a foreign concept to them. Personally, I enjoy it when other people present different perspectives. It’s both interesting and a great way to learn.

            And yeah, “power hungry” has a similar vibe. Nice!

            • @Ropianos@feddit.de · 1 point · 2 years ago

              Totally agree, though I haven’t really participated that much on Reddit. Seems like any disagreement is quickly framed as trolling over there.

              • magnetosphere · 2 points · 2 years ago

                Yeah. People are very defensive, because even the insinuation of disagreement is taken as a personal attack. It gets into your head, and takes a while to get over. You forget that reasonable, well-meaning conversations are even possible. It sucks. I’ll never go back.

  • @just_another_person@lemmy.world · 29 points · 2 years ago

    At what point was porn NOT graphic, but now this thing IS GRAPHIC? Are we talking all caps, or just a small difference between the live stuff and the AI shit? Inquiring minds want to know.

  • @sramder@lemmy.world · 35 points · 2 years ago

    People who insist on real-flesh porn will ultimately be viewed as weirdos out of touch with reality, like people who insist everything sounds better on vinyl.

    Fast forward 25 years past the first AI war, and a ragged but triumphant humanity must rediscover the lost art of waxing.

    • @Harpsist@lemmy.world · 1 point · 2 years ago

      Why would I want to encourage the flesh trade, where real women are hurt and are limited to what humans are physically capable of?

      When I can have AI-generated people who are able to do anything imaginable, and no one gets hurt?

      There’ll be arguments that ‘once people get used to the fantasies, they’ll want to try it in real life’, but we all know that just isn’t true from 40 years of video games. There hasn’t been any uptick in people eating mushrooms and jumping on turtles or whatever the fuck a Goomba is.

    • @Knusper@feddit.de · 11 points · 2 years ago

      Well, to develop such a service, you need training data, i.e. lots of real child pornography in your possession.

      Legality for your viewers will also differ massively around the world, so your target audience may not be very big.

      And you probably need investors, who likely have less risky projects to invest in.

      Well, and then there’s also the factor of some humans just not wanting to work on disgusting, legal grey area stuff.

      • @Womble@lemmy.world · 19 points · 2 years ago

        Yup, just like the AI needed lots of pictures of astronauts on horses to make pictures of those…

        • @JonEFive@midwest.social · 6 points · 2 years ago

          Exactly. Some of these engines are perfectly capable of combining differing concepts. In your example, it knows basically what a horse looks like, and what a human riding on horseback looks like. It also knows that an astronaut looks very much like a human without a space suit and can put the two together.

          Saying nothing of the morality, in this case I suspect that an AI could be trained using pictures of clothed children, perhaps combined with nude images of people who are of age but are very slim or otherwise have a youthful appearance.

          While I think it’s repugnant in concept, I also think that for those seeking this material, I’d much rather it be AI generated than an actual exploited child. Realistically though, I doubt that this would actually have any notable impact on the prevalence of CSAM, and it might even make it more accessible.

          Furthermore, if the generative AI gets good enough, it could make it difficult to determine whether an image is real or AI generated. That would make it more difficult for police to find the child and offender to try to remove them from that situation. So now we need an AI to help analyze and separate the two.

          Yeah… I don’t like living in 2023 and things are only getting worse. I’ve put way more thought into this than I ever wanted to.

          • @Ryantific_theory@lemmy.world · 3 points · 2 years ago

            Aren’t AI-generated images pretty obvious to detect from noise analysis? I know there’s no effective detection for AI-generated text, and I’m not saying there won’t be projects to train AI to generate perfectly realistic images, but it’ll be a while before it does fingers right, let alone cleans up the invisible pixel artifacts.

            As a counterpoint, won’t the prevalence of AI generated CSAM collapse the organized abuse groups, since they rely on the funding from pedos? If genuine abuse material is swamped out by AI generated imagery, that would effectively collapse an entire dark web market. Not that it would end abuse, but it would at least undercut the financial motive, which is progress.

            That’s pretty good for 2023.

            • JackbyDev · 2 points · 2 years ago

              With Stable Diffusion you can intentionally leave an “invisible watermark” that machines can easily detect but humans cannot see. The idea is that in the future you don’t accidentally train on images that are already AI generated. I’d hope most sites are doing that, but it can be turned off easily enough. Apart from that I’m not sure.
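              If anyone wants to see roughly how that works, here is a minimal sketch using the invisible-watermark package that the reference Stable Diffusion scripts rely on; the payload string and file names are just placeholders:

```python
# Sketch: embed and later read back an invisible watermark of the kind the
# reference Stable Diffusion scripts apply, using the invisible-watermark
# package. Payload text and file names are placeholders.
import cv2
from imwatermark import WatermarkEncoder, WatermarkDecoder

payload = b"AI-generated"  # arbitrary marker, 12 bytes = 96 bits

# Embed: operates on a BGR numpy array as loaded by OpenCV.
bgr = cv2.imread("generated.png")
encoder = WatermarkEncoder()
encoder.set_watermark("bytes", payload)
marked = encoder.encode(bgr, "dwtDct")  # DWT+DCT frequency-domain embedding
cv2.imwrite("generated_marked.png", marked)

# Detect: a machine can recover the payload, the eye cannot see it.
decoder = WatermarkDecoder("bytes", len(payload) * 8)
recovered = decoder.decode(cv2.imread("generated_marked.png"), "dwtDct")
print(recovered == payload)  # True if the watermark survived
```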

              • @Ryantific_theory@lemmy.world · 1 point · 2 years ago

                I could have sworn I saw an article talking about how there were noise artifacts that were fairly obvious, but now I can’t turn anything up. The watermark should help things, but outside of that it looks like there’s just a training dataset of pure generative AI images (GenImage) to train another AI to detect generated images. I guess we’ll see what happens with that.
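                For what it’s worth, training that kind of detector is conceptually simple: fine-tune an off-the-shelf classifier on real-vs-generated folders. A rough sketch (the data layout here is an assumption, not the actual GenImage format):

```python
# Rough sketch of training a real-vs-AI image classifier on a dataset
# like GenImage. The folder layout data/train/{real,ai} is an assumption.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("data/train", transform=tfm)  # classes: ai, real
loader = DataLoader(train_ds, batch_size=32, shuffle=True, num_workers=4)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # binary head: ai vs real
model = model.to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch} done, last batch loss {loss.item():.4f}")
```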

      • @d13@programming.dev · 5 points · 2 years ago

        Unfortunately, no, you just need training data on children in general and training data with legal porn, and these tools can combine it.

        It’s already being done, which is disgusting but not surprising.

        People have worried about this for a long time. I remember a subplot of a sci-fi series that got into this. (I think it was The Lost Fleet, 15 years ago).

    • 👁️👄👁️ · 2 points · 2 years ago

      You’d also have to convince them that it’s not real. It’ll probably end up creating laws tbh. Then there are weird things like Japan where lolis are legal, but uncensored genitals aren’t, even drawn.

    • @inspxtr@lemmy.world · 9 points · 2 years ago

      I remember reading that this may already be happening to some extent, e.g. people sharing tips on creating it on the deep web, maybe through prompt engineering, fine-tuning, or pretraining.

      I don’t know how those models are made, but I do wonder whether the ones that need retraining/fine-tuning on real CSAM can be classified as breaking the law.

        • JackbyDev · 1 point · 2 years ago

          If a search engine cannot index it then it is the deep web. So yes, Discord chats are technically part of the deep web.

            • JackbyDev · 2 points · 2 years ago (edited)

              Wikipedia on the deep web

              The deep web,[1] invisible web,[2] or hidden web[3] are parts of the World Wide Web whose contents are not indexed by standard web search-engine programs.

              Try accessing a Discord channel through your browser without being logged in. They aren’t indexed by search engines because you have to be logged in.

                • JackbyDev · 1 point · 2 years ago

                  I don’t care about some arbitrary challenge to get money from you. I’m trying to get you to think critically. If search engines like Google don’t index it then it’s part of the deep web. Just because things like Discord aren’t what people typically mean when people talk about the deep web doesn’t make Discord chats not part of the deep web.

    • mrnotoriousman · 5 points · 2 years ago

      There was an article the other day about underage girls in France having AI nudes spread around, based on photos of girls as young as 12. Definitely harm there.

    • @drekly@lemmy.world · 9 points · 2 years ago

      CivitAI is a pretty perverted site at the best of times. But there’s a disturbing number of age-adjustment plugins for making images of children on the same site that has plugins for generating sex acts. It’s clear some people definitely are.

      • oats · 4 points · 2 years ago

        Some models also prefer children for some reason, and then you have to put mature/adult in the positive prompt and child in the negative.

        • @lloram239@feddit.de · 3 points · 2 years ago (edited)

          I think part of the problem is that there is a lot of anime in the models, and when you don’t filter that out with negative prompts it can distort the proportions of realistic images (e.g. everybody gets huge breasts unless you negative-prompt it away). In general, models are always heavily biased towards what they were trained on, and when you use a prompt or LoRA that worked well on one model on another, you can get weird results. There is always a lot of nudging involved with keywords and weights to get the images to where you want them.
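          For reference, this is roughly what that positive/negative steering looks like with the diffusers library; the checkpoint id and prompt text are placeholder assumptions, not a recommendation of any specific model:

```python
# Sketch of steering a checkpoint with positive and negative prompts, as
# described above. Checkpoint id and prompt text are placeholder assumptions.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed base checkpoint
    torch_dtype=torch.float16,
).to("cuda")

image = pipe(
    prompt="photo of an adult woman, mature, realistic proportions, detailed skin",
    # The negative prompt pushes the sampler away from biases baked into the
    # training data (anime proportions, unwanted subjects).
    negative_prompt="anime, cartoon, child, deformed, extra limbs",
    guidance_scale=7.5,
    num_inference_steps=30,
).images[0]
image.save("output.png")
```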

    • Rustmilian · 8 points · 2 years ago (edited)

      Hentai, maybe. But realistic shit is 100% illegal; even just making such an AI would require breaking the law, as you’d have to use real CSAM to train it.

  • @RBWells@lemmy.world · 11 points · 2 years ago

    Meh. It’s all only women and so samey samey. Not sexy IMO, but I don’t think fake is necessarily not hot; art certainly can be.

    • @Zerfallen@lemmy.world · 4 points · 2 years ago

      You can change it to men, but most of the results are intersex(?) or outright women anyway. I guess the training data is heavily weighted toward examples of women.

  • themeatbridge · 15 points · 2 years ago

    Does it say something about society that our automatons are better at creating simulated genitals than they are at hands?

    • @douglasg14b@lemmy.world · 8 points · 2 years ago (edited)

      It says that we are biologically predisposed to sex, which we are, like animals, which we are.

      It doesn’t say anything about society, it just confirms the human condition.

    • @lloram239@feddit.de · 4 points · 2 years ago (edited)

      They suck quite a lot at genitals too. But what makes hands especially tricky is simply that they are pretty damn complex. A hand has five fingers that can all move independently, the hand can rotate in all kinds of ways, and the individual parts of a hand can all occlude each other. There is a lot of stuff you have to get right to produce a good-looking hand, and it is especially difficult when you are just a simple 2D algorithm with little idea of 3D structure or motion.

    • @Bop@lemmy.film · 2 points · 2 years ago

      On a visual level, we are more interested in genitals than hands? Also, faces.

  • @Harpsist@lemmy.world · 2 points · 2 years ago

    I have been waiting for this day ever since I first heard about AI-generated images a decade or so ago.

  • Seraph · 9 points · 2 years ago

    While amazing, most of these are hilariously wrong. Let’s just say the girl with 4 bellybuttons was one of the more tame mistakes I saw.