A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led the uniform-wearing McCorkle out of the theater in handcuffs.

  • Fernlike · 60 points · 8 months ago

    A generative AI model doesn’t need the exact thing it creates to be in its training data. It most likely just combined regular nudity with pictures of children.

    • finley · 9 points · 8 months ago

      In that case, the images of children were still used without their permission to create the child porn in question.

      • Fernlike · 7 points · 8 months ago

        That’s a separate issue from the AI model being trained on CSAM. I’m currently neutral on this topic, so I’d recommend replying to the main thread instead.

          • Fernlike · 16 points · edited · 8 months ago

            It’s not CSAM in the training dataset, just pictures of children/people that are already publicly available. That puts it on the copyright side of the AI debate rather than the illegal-training-material side.

            • finley · 5 points · edited · 8 months ago

              It’s images of children used to make CSAM. No amount of mental gymnastics can change that, nor the fact that those children’s consent was not obtained.

              Why are you trying so hard to rationalize the creation of CSAM? Do you actually believe there is a context in which CSAM is OK? Are you that sick and perverted?

              Because it really sounds like that’s what you’re trying to say, using copyright law as an excuse.

              • @[email protected] · 31 points · 8 months ago

                It’s the same every time with you people: you can’t have a discussion without accusing someone of being a pedo. If that’s your go-to, it says a lot about how weak your argument is, or what your motivations are.

                • finley · 2 points · 8 months ago

                  It’s hard to believe someone is not a pedo when they advocate so strongly for child porn.

                  • ObjectivityIncarnate · 11 points · 8 months ago

                    You’re just projecting your unwillingness to ever take a stance that doesn’t personally benefit you.

                    Some people can think about things objectively and draw a conclusion that makes sense to them without personal benefit being a primary determinant of said conclusion.

                  • @[email protected] · 11 points · 8 months ago

                    It’s hard to argue with someone who believes that using legal data to create more data can ever be illegal.

              • Fernlike · 19 points · 8 months ago

                I am not trying to rationalize it, I literally just said I was neutral.

                • finley · 2 points · 8 months ago

                  How are you neutral about child porn? The vast majority of people on this planet are very much against it.

                  • Fernlike · 10 points · 8 months ago

                    I’m not neutral about child porn, I’m very much against it, so stop trying to put words in my mouth. My point is that this kind of use of AI could fall into the very same category as loli imagery, since it is not real child sexual abuse material.

                • finley · 2 points · 8 months ago

                  It’s hard to believe you’re not a pedophile when you advocate so strongly for child porn.

      • @[email protected] · 33 points · 8 months ago

        That’s not really a nuanced take on what is going on. A bunch of images of children are studied so that the AI can learn how to draw children in general. The more children in the dataset, the less any one of them influences or resembles the output.

        Ironically, you might have to train an AI specifically on CSAM in order for it to identify the kinds of images it should not produce.

        • finley · 3 points · 8 months ago

          “Nuh uh!” is a pretty weak argument.

      • @[email protected] · 8 points · 8 months ago

        Good luck convincing the AI advocates of this. They have already decided that all imagery everywhere is theirs to use however they like.