Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • southsamurai · 47 · 2 years ago

    These articles are written by idiots, serving the whims of a corporate stooge to try and smear anything other than corporate services, and it isn’t even thinly veiled. Look at who this all comes from.

    • @[email protected]
      link
      fedilink
      62 years ago

      It’s weird how this headline shows up only when other headlines start covering how popular Mastodon is now.

      Coincidence? Sure smells like it. God, I love astroturfing in the morning.

    • TWeaK · 7 · 2 years ago

      The article written by WaPo and regurgitated by The Verge is crap, but the study from Stanford is solid. However, it’s nowhere near as doom and gloom as the articles, and suggests plenty of ways to improve things. Primarily they suggest better tools for moderation.

      • @[email protected]
        cake
        link
        fedilink
        32 years ago

        The study from Stanford conflates pencil drawings of imaginary characters with actual evidence of child rape.

        Half the goddamn point of saying CSAM instead of CP is to make that difference blindingly obvious. Somehow, they still missed it. Somehow they are talking about sexual abuse as if it’s something that can happen to pixels.

  • @[email protected]
    link
    fedilink
    -7
    edit-2
    2 years ago

    The article points out that the strength of the Fediverse is also its downside. Federated moderation makes it challenging to consistently moderate CSAM.

    We have seen it even here with the challenges of Lemmynsfw. In fact, they have taken the stance that CSAM-like images with of-age models made to look underage are fine as long as there is some dodgy ‘age verification’.

    The idea is that abusive instances would get defederated, but I think we are going to find that inadequate to keep up without some sort of centralized reporting escalation and AI auto-screening.

    • rodneylives · 15 · 2 years ago

      The problem with screening by AI is that there are going to be false positives, and it’s going to be extremely challenging and frustrating to fight them. Last month I got a letter for a speeding infraction that was automated: it was generated by a camera, the plate read in by OCR, and the letter I received (from “Seat Pleasant, Maryland,” lol) was supposedly signed off by a human police officer, but the image was so blurry that the plate was practically unreadable. Sure enough, the OCR got one of the letters wrong, and I got a speeding ticket from a town I’ve never been to and had never even heard of before I got that letter. The letter was full of helpful ways to pay for and dispense with the ticket, but to challenge it I had to do it in writing; there was no email address anywhere in the letter. I had to go to their website and sift through dozens of pages to find one that had any chance of being able to do something about it, and I made a couple of false steps along the way. THEN, after calling them up and explaining the situation, they apologized and said they’d dismiss the charge, which they failed to do: I got another letter about it just TODAY saying a late fee had now been tacked on.

      And this was mere OCR, which has been in use for multiple decades and is fairly stable now. This pleasant process is coming to anything involving AI as a judging mechanism.

      • @[email protected]
        link
        fedilink
        82 years ago

        Off topic, but a few years ago a town in Tennessee had their speed camera contractor screw up in this way. Unfortunately for them, they tagged an elderly couple whose son was a very good attorney. He sued the town for enough winnable civil liability to bankrupt them and force them to disincorporate.

        Speed cameras are all but completely illegal in TN now.

        • Madison_rogue · 6 · 2 years ago (edited)

          When I lived in Clarksville, they had intersection cameras to ticket anyone who ran a red light. A couple of problems with it:

          1. Drivers started slamming on their brakes, causing more accidents.
          2. The city outsourced the cameras, so they received only pennies on the dollar for every ticket.

          I think they eventually removed them, but I can’t recall. I visited last September to take a class for work, and I didn’t see any cameras, so they might be gone.

          • @[email protected]
            link
            fedilink
            32 years ago

            Dresden’s House rep got a bill passed several years ago to outlaw them. The Australian vendor’s lobbyists managed to get a carve out for school zones and blind curves, but I haven’t even seen any of those in years.

            • Madison_rogue · 3 · 2 years ago

              Wait…so the company that supplied the cameras wasn’t even from the U.S.?

              Wow…this just gets more insane the more I learn about it. As conservative as Tennessee can be, they first outsourced their law enforcement for a return of pennies on the dollar to the city, AND the taxpayers ended up subsidizing a foreign company.

      • @[email protected]
        link
        fedilink
        7
        edit-2
        2 years ago

        THEN, after calling them up and explaining the situation, they apologized and said they’d dismiss the charge–which they failed to do

        That sounds about right. When I was in college I got a speeding ticket halfway in between the college town and the city my parents lived in. I couldn’t afford the fine due to being a poor college student, and called the court and asked if an extension was possible. They told me absolutely, how long do you need, and then I started saving up. Shortly before I had enough, I got a call from my mom that she had received a letter saying there was a bench warrant for my arrest over the fine.

  • blazera · 14 · 2 years ago

    So what I’m reading is they didn’t actually look at any images; they found hashtags, undisclosed hashtags at that. So basically we’ve no idea what they think they found; for all we know, ‘cartoon’ might’ve been one of the tags.

    • @[email protected]
      link
      fedilink
      4
      edit-2
      2 years ago

      Or maybe it’s better to err on the side of caution when it comes to maybe one of the worst legal offences you can commit?

      I’m tired of people harping on this decision when it’s a perfectly legitimate one from a legal standpoint. There’s a reason tons of places are very iffy about nsfw content.

  • @[email protected]
    link
    fedilink
    382 years ago

    Seems odd that they mention Mastodon as a Twitter alternative in this article, but do not make any mention of the fact that Twitter is also rife with these problems, more so as they lose employees and therefore moderation capabilities. These problems have been around on Twitter for far longer, and not nearly enough has been done.

    • @[email protected]
      link
      fedilink
      11
      edit-2
      2 years ago

      The actual report is probably better to read.

      It points out that you upload to one server, and that server then sends the image to thousands of others. How do those thousands of others scan for this? In theory, using the PhotoDNA tool that large companies use, but then you have to send every image to PhotoDNA thousands of times, once for each server (because how do you trust another server telling you it’s fine?).

      The report provides recommendations on how servers can use signatures and public keys to trust scan results from PhotoDNA, so images can be federated with a level of trust. It also suggests large players entering the market (Meta, Wordpress, etc) should collaborate to build these tools that all servers can use.
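      As a rough sketch of that signed-result idea (Python, using the `cryptography` library; the attestation format and field names are my own assumptions, not anything PhotoDNA or Mastodon actually ship): the origin server scans an image once and signs its verdict, and receiving servers verify the signature plus the image hash instead of re-scanning.

      ```python
      # Hypothetical "signed scan result": the origin server signs a verdict for an
      # image hash; receiving servers verify it instead of re-submitting the image.
      import json
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import (
          Ed25519PrivateKey,
          Ed25519PublicKey,
      )

      def sign_scan_result(key: Ed25519PrivateKey, image_sha256: str, verdict: str) -> dict:
          payload = json.dumps({"sha256": image_sha256, "verdict": verdict}, sort_keys=True)
          return {"payload": payload, "signature": key.sign(payload.encode()).hex()}

      def verify_scan_result(pub: Ed25519PublicKey, attestation: dict, local_sha256: str) -> bool:
          try:
              pub.verify(bytes.fromhex(attestation["signature"]), attestation["payload"].encode())
          except InvalidSignature:
              return False
          # Only trust the verdict if it covers the image we actually received.
          return json.loads(attestation["payload"])["sha256"] == local_sha256
      ```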

      Basically the original report points out the ease of finding CSAM on mastodon, and addresses the challenges unique to federation including proposing solutions. It doesn’t claim centralised servers have it solved, it just addresses additional challenges federation has.

    • @[email protected]
      cake
      link
      fedilink
      272 years ago

      4.1 Illustrated and Computer-Generated CSAM

      Stopped reading.

      Child abuse laws “exclude anime” for the same reason animal cruelty laws “exclude lettuce.” Drawings are not children.

      Drawings are not real.

      Half the goddamn point of saying CSAM instead of CP is to make clear that Bart Simpson doesn’t count. Bart Simpson is not real. It is fundamentally impossible to violate Bart Simpson’s rights, because he doesn’t fucking exist. There is nothing to protect him from. He cannot be harmed. He is imaginary.

      This cannot be a controversial statement. Anyone who can’t distinguish fiction from real life has brain problems.

      You can’t rape someone in MS Paint. Songs about murder don’t leave a body. If you write about robbing Fort Knox, the gold is still there. We’re not about to arrest Mads Mikkelsen for eating people. It did not happen. It was not real.

      If you still want to get mad at people for jerking off to the wrong fantasies, that is an entirely different problem from photographs of child rape.

      • @[email protected]
        link
        fedilink
        2
        edit-2
        2 years ago

        Okay, thanks for the clarification.

        Everyone except you still very much includes drawn and AI-generated pornographic depictions of children within the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I’m not sure your point changed anything.

        • @[email protected]
          cake
          link
          fedilink
          -12 years ago

          If you don’t think images of actual child abuse, against actual children, is infinitely worse than some ink on paper, I don’t care about your opinion of anything.

          You can be against both. Don’t ever pretend they’re the same.

          • @[email protected]
            link
            fedilink
            -12 years ago

            Hey, just because someone has a stupid take on one subject doesn’t mean they have a stupid take on all subjects. Attack the argument, not the person.

            • @[email protected]
              cake
              link
              fedilink
              12 years ago

              Some confused arguments reveal confused people. Some terrible arguments reveal terrible people. For example: I don’t give two fucks what Nazis think. Life’s too short to wonder which subjects they’re not facile bastards about.

              If someone’s motivation for making certain JPEGs hyper-illegal is “they’re icky” - they’ve lost benefit of the doubt. Because of their decisions, I no longer grant them that courtesy.

              Demanding pointless censorship earns my dislike.

              Equating art with violence earns my distrust.

              • @[email protected]
                link
                fedilink
                12 years ago

                Perhaps. But pretty much everyone has a stupid take on something.

                There’s obviously a limit there, but most people can be reasoned with. So instead of jumping to a conclusion, attempt a dialogue first until they prove that they can’t be reasoned with. This is especially true on SM where, even if you can’t convince the person you’re talking with, you may just convince the next person to come along.

                • @[email protected]
                  cake
                  link
                  fedilink
                  02 years ago

                  Telling someone why they’re a stupid bastard for the sake of other people is not exactly a contradiction. You know what doesn’t do observers any good? “Debating” complete garbage, in a way that lends it respect and legitimacy. Sometimes you just need to call bullshit.

                  Some bullshit is so blatant that it’s a black mark against the bullshitter.

            • @[email protected]
              link
              fedilink
              0
              edit-2
              2 years ago

              He invented the stupid take he’s fighting against. Nobody equated “ink on paper” with “actual rape against children”.

              The bar to cross to be filtered out of the federation isn’t rape. Lolicon is already above the threshold; it’s embarrassing that he doesn’t realize that.

              • @[email protected]
                link
                fedilink
                12 years ago

                I don’t think the OP ever said the bar was rape; the OP said the article and the person they responded to are treating drawn depictions of imaginary children the same as depictions of actual children. Those are not the same thing at all, yet many people seem to conflate them (apparently including US law as of the PROTECT Act of 2003).

                Some areas make a distinction (e.g. Japan and Germany), whereas others don’t. Regardless of the legal status in your area, the two should be treated separately, even if that means both are banned.

                • @[email protected]
                  link
                  fedilink
                  1
                  edit-2
                  2 years ago

                  “treating them the same” => The threshold for being refused entry into mainstream instances is just already crossed at the lolicon level.

                  From the perspective of the fediverse, pictures of child rape and lolicon should just both get you thrown out. That doesn’t mean you’re “treating them the same”. You’re just a social network. There’s nothing you can do above defederating.

              • @[email protected]
                cake
                link
                fedilink
                -12 years ago

                We’re not just talking about ‘ew gross icky’ exclusion from a social network. We’re talking about images whose possession is a felony. Images that are unambiguously the product of child rape.

                This paper treats them the same. You’re defending that false equivalence. You need to stop.

                • @[email protected]
                  link
                  fedilink
                  1
                  edit-2
                  2 years ago

                  Who places the bar for “exclusion from a social network” at felonies? Any kind of child porn has no place on the fediverse, simulated or otherwise. That doesn’t mean they’re equal offenses; you’re just not responsible for carrying out anything other than cleaning out your porch.

                • @[email protected]
                  cake
                  link
                  fedilink
                  22 years ago

                  ‘Everyone but you agrees with me!’ Bullshit.

                  ‘Nobody wants this stuff that whole servers exist for.’ Self-defeating bullshit.

                  ‘You just don’t understand.’ Not an argument.

        • @[email protected]
          link
          fedilink
          52 years ago

          They are not saying it shouldn’t be defederated; they are saying reporting this to authorities is pointless and that considering it CSAM is harmful.

            • @[email protected]
              link
              fedilink
              22 years ago

              What’s the point of reporting it to authorities? It’s not illegal, nor should it be because there’s no victim, so all reporting it does is take up valuable time that could be spent tracking down actual abuse.

              • @[email protected]
                link
                fedilink
                1
                edit-2
                2 years ago

                It’s illegal in a lot of places including where I live.

                In the US, you have the PROTECT Act of 2003:

                (a) In General.—Any person who, in a circumstance described in subsection (d), knowingly produces, distributes, receives, or possesses with intent to distribute, a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting, that—
                (1) (A) depicts a minor engaging in sexually explicit conduct; and (B) is obscene; or
                (2) (A) depicts an image that is, or appears to be, of a minor engaging in graphic bestiality, sadistic or masochistic abuse, or sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; and (B) lacks serious literary, artistic, political, or scientific value;
                or attempts or conspires to do so, shall be subject to the penalties provided in section 2252A(b)(1), including the penalties provided for cases involving a prior conviction.

                Linked to the obscenity doctrine

                https://www.law.cornell.edu/uscode/text/18/1466A

                • @[email protected]
                  link
                  fedilink
                  1
                  edit-2
                  2 years ago

                  Wow, that’s absolutely ridiculous, thanks for sharing! That would be a very unpopular bill to get overturned…

                  I guess it fits with the rest of the stupidly named bills. It doesn’t protect anything, it just prosecutes undesirable behaviors.

            • @[email protected]
              link
              fedilink
              32 years ago

              Definitions of CSAM definitely do not include illustrated and simulated forms. They do not have a victim and therefore cannot be abuse. I agree that it should not be allowed on public platforms, hence why all instances hosting it should be defederated. Despite this, it is not illegal, so reporting it to authorities is a waste of time for you and the authorities who are trying to remove and prevent actual CSAM.

              • @[email protected]
                link
                fedilink
                1
                edit-2
                2 years ago

                CSAM definitions absolutely include illustrated and simulated forms. Just check the sources on the Wikipedia link and climb your way up; you’ll see “cartoons, paintings, sculptures, …” in the wording of the PROTECT Act.

                They don’t actually need a victim to be defined as such.

                • @[email protected]
                  link
                  fedilink
                  12 years ago

                  That Wikipedia article is about CP, a broader topic. Practically zero authorities will include illustrated and simulated forms of CP in their definitions of CSAM.

        • @[email protected]
          cake
          link
          fedilink
          32 years ago

          What does that even mean?

          There’s nothing to “cover.” They’re talking about illustrations of bad things, alongside actual photographic evidence of actual bad things actually happening. Nothing can excuse that.

          No shit they are also discussing actual CSAM alongside… drawings. That is the problem. That’s what they did wrong.

      • @[email protected]
        link
        fedilink
        52 years ago

        Oh, wait, Japanese in the other comment, now I get it. This conversation is about AI Loli porn.

        Pfft, of course, that’s why no one is saying the words they mean, because it suddenly becomes much harder to take the stance since hatred towards Loli Porn is not universal.

        • @[email protected]
          link
          fedilink
          42 years ago

          I mean, I think it’s disgusting, but I don’t think it should be illegal. I feel the same way about cigarettes, 2 girls 1 cup, and profane language. It’s absolutely not for me, but that shouldn’t make it illegal.

          As long as there’s no victim, knock yourself out with whatever disgusting, weird stuff you’re into.

      • Mark · 1 · 2 years ago

        Oh no, what you describe is definitely illegal here in Canada. CSAM includes depictions here. Child sex dolls are illegal. And it should be that way because that stuff is disgusting.

        • @[email protected]
          cake
          link
          fedilink
          -2
          edit-2
          2 years ago

          CSAM includes depictions here.

          Literally impossible.

          Child rape cannot include drawings. You can’t sexually assault a fictional character. Not “you mustn’t.” You can’t.

          If you think the problem with child rape amounts to ‘ew, gross,’ fuck you. Your moral scale is broken, if there’s not a vast gulf between those two bad things.

    • Lemdee · 61 · 2 years ago

      So if I’m understanding right, based on their recommendations this will all be addressed as more moderation and QOL tools are introduced as we move further down the development roadmap?

      • @[email protected]
        link
        fedilink
        -782 years ago

        What development roadmap? You’re not a product manager and this isn’t a Silicon Valley startup.

        • Fuck Lemmy.World · 121 · 2 years ago

          What makes you think that development roadmaps are exclusive to Silicon Valley startup product managers, and not just a general practice in software engineering?

          Mastodon actually does have a roadmap, and you can find it here: https://joinmastodon.org/roadmap

          • @[email protected]
            link
            fedilink
            32 years ago

            As does most successful open source software. It’s more of a “this is where we’d like to see things go long term, but that in no way restricts contributions; it merely helps communicate the ideas of the core contributors.”

    • @[email protected]
      link
      fedilink
      English
      492 years ago

      If I can try to summarize the main findings:

      1. Computer-generated (e.g., Stable Diffusion) child porn is not criminalized in Japan, and so many Japanese Mastodon servers don’t remove it
      2. Porn involving real children is removed, but not immediately, as it depends on instance admins to catch it, and they have other things to do. Also, when an account is banned, the Mastodon server software is not sending out a “delete” for all of their posted material (which would signal other instances to delete it)

      Problem #2 can hopefully be improved with better tooling. I don’t know what you do about problem #1, though.

      • @[email protected]
        link
        fedilink
        English
        42 years ago

        I don’t know what you do about problem #1, though.

        Well the simple answer is that it doesn’t have to be illegal to remove it.

        The legal question is a lot harder, considering AI image generation has reached levels that are almost indistinguishable from reality.

        • @[email protected]
          link
          fedilink
          English
          3
          edit-2
          2 years ago

          In which case, admins should err on the side of caution and remove something that might be illegal.

          I personally would prefer to have nothing remotely close to CSAM, but as long as children aren’t being harmed in any conceivable way, I don’t think it would be illegal to post art containing children. But communities should absolutely manage things however they think is best for their community.

          In other words, I don’t think #1 is a problem at all, imo things should only be illegal if there’s a clear victim.

      • CaptainBasculin · 10 · 2 years ago

        Such a signal exists in the ActivityPub protocol, so I wonder why it’s not being used.
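        For context, the signal being referred to is ActivityPub’s Delete activity (Mastodon federates deletions as a Delete wrapping a Tombstone object). Here’s a rough Python sketch of what such a payload looks like; the actor and object URLs are made-up placeholders:

        ```python
        # Rough sketch of an ActivityPub Delete activity: the origin server announces
        # that a post is gone, and receiving servers are expected to drop their copy.
        # The actor and object URLs below are made-up placeholders.
        import json

        def build_delete_activity(actor_url: str, object_url: str) -> str:
            return json.dumps({
                "@context": "https://www.w3.org/ns/activitystreams",
                "id": f"{object_url}#delete",
                "type": "Delete",
                "actor": actor_url,
                "to": ["https://www.w3.org/ns/activitystreams#Public"],
                "object": {"type": "Tombstone", "id": object_url},
            }, indent=2)

        print(build_delete_activity(
            "https://example.social/users/alice",
            "https://example.social/users/alice/statuses/1",
        ))
        ```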

      • @[email protected]
        link
        fedilink
        English
        272 years ago

        One option would be to decide that the underlying point of removing real CSAM is to avoid victimizing real children; and that computer-generated images are no more relevant to this goal than Harry/Draco slash fiction is.

        • @[email protected]
          link
          fedilink
          English
          12 years ago

          And are you able to offer any evidence to reassure us that simulated child pornography doesn’t increase the risk to real children as pedophiles become normalised to the content and escalate (you know, like what already routinely happens with regular pornography)?

          Or are we just supposed to sacrifice children to your gut feeling?

          • @[email protected]
            link
            fedilink
            English
            1
            edit-2
            2 years ago

            Would you extend the same evidence-free argument to fictional stories, e.g. the Harry/Draco slash fiction that I mentioned?

            For what it’s worth, your comment has already caused ten murders. I don’t have to offer evidence, just as you don’t. I don’t know where those murders happened, or who was murdered, but it was clearly the result of your comment. Why are you such a terrible person as to post something that causes murder?

            • @[email protected]
              link
              fedilink
              English
              22 years ago

              I have no problem saying that writing stories about two children having gay sex is pretty fucked in the head, along with anyone who forms a community around sharing and creating it.

              But it’s also not inherently abuse, nor is it indistinguishable from reality.

              You’re advocating that people just be cool with photo-realistic images of children, of any age, being raped, by any number of people, in any possible way, with no assurances that the images are genuinely “fake” or that pedophiles won’t be driven to make it a reality, despite other pedophiles cheering them on.

              I was a teenage contrarian pseudo-intellectual once upon a time too, but I never sold out other people’s children for something to jerk off to.

              If you want us to believe it’s harmless, prove it.

              • @[email protected]
                link
                fedilink
                English
                -12 years ago

                You keep making up weird, defamatory accusations. Please stop. This isn’t acceptable behavior here.

                • @[email protected]
                  link
                  fedilink
                  English
                  22 years ago

                  Awful pearl-clutchy for someone advocating for increased community support for photorealistic images of children being raped.

                  Which do you think is more acceptable to Lemmy in general? Someone saying “fuck”, or communities dedicated to photorealistic images of children being raped?

                  Maybe I’m not the one who should be changing their behavior.

    • DrNeurohax · 30 · 2 years ago

      Well, terrorists became boring, and they still want the loony wing of the GOP’s clicks, so best to back off on Nazis and pro-Russians, leaving pedophiles as the safest bet.

        • DrNeurohax · 3 · 2 years ago

          Agreed. I’m in my 40s, and I’ve never seen anywhere near the level of subsurface signaling and intentional complacency we’re experiencing now.

  • BarterClub · 8 · 2 years ago

    This seems like a very normal thing with all social media. Now if the server isn’t banning and removing the content within a reasonable amount of time then we have major issues.

    Talking about Mastodon but not Twitter or Facebook in the same post makes it feel like one is a bigger problem than the others. This article seems half-baked, written to get clicks.

  • 👁️👄👁️ · 35 · 2 years ago (edited)

    Nothing you can do except go after server owners like usual. It has nothing to do with the fedi. Mastodon has nothing to do with it either, because anyone can pop up their own alternative server. This is one of many protocols they have used or will use to distribute this stuff.

    This just in: criminals are using the TCP protocol to distribute CP!!! What can the internet do to stop this? Oh yeah, go after server owners and groups like usual.

    • @[email protected]
      link
      fedilink
      11
      edit-2
      2 years ago

      Things are a bit complicated in the fediverse. Sure, your instance might not host any pedo community, but if a user on your instance subscribes to or interacts with those communities, the CSAM might get federated into your instance without you noticing. There are tools to help you combat this, but as an instance owner you can’t just assume it’s not your problem if some other instance hosts pedo stuff.

      • 👁️👄👁️ · 6 · 2 years ago

        That is definitely alarming, and a downside of the fedi, but it seems like a necessary evil. Unfortunately, admins and mods of small communities in the fedi will be the ones exposed to this. There are better methods of handling this, though. There are shared block lists out there that already block out undesirable stuff like that, so it at least minimizes how much the mods, who are just regular unpaid people, have to see disgusting stuff. Also, obviously those instances should be reported to the police, FBI, or whatever the heck.
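        As a sketch of how a shared block list gets applied in practice (the list URL and CSV columns here are hypothetical, loosely modelled on the domain-block CSVs admins already pass around):

        ```python
        # Minimal sketch: fetch a community-maintained CSV of domains and severities,
        # and reject activities from domains marked "suspend". The URL and column
        # names are hypothetical.
        import csv
        import io
        import urllib.request
        from urllib.parse import urlparse

        BLOCKLIST_URL = "https://example.org/shared-blocklist.csv"  # hypothetical

        def load_blocked_domains(url: str = BLOCKLIST_URL) -> set[str]:
            with urllib.request.urlopen(url) as resp:
                rows = csv.DictReader(io.StringIO(resp.read().decode("utf-8")))
                return {r["domain"].lower() for r in rows if r.get("severity") == "suspend"}

        def should_reject(actor_url: str, blocked: set[str]) -> bool:
            # Drop anything whose actor lives on a suspended domain.
            return (urlparse(actor_url).hostname or "").lower() in blocked
        ```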

    • @[email protected]
      link
      fedilink
      42 years ago

      There is a database of known CSAM files and their hashes; Mastodon could implement a filter against those at posting time and when federating content.

      Shadow-banning those users would be nice too.
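      A minimal sketch of what that hash check could look like at upload and federation time, assuming a set of known-bad hashes is already available (real deployments match perceptual hashes like PhotoDNA through vetted APIs rather than exact SHA-256, which a single re-encode defeats):

      ```python
      # Illustration only: refuse media whose hash appears in a known-bad set.
      # Real systems use perceptual hashing (e.g. PhotoDNA) via vetted industry
      # APIs, since exact hashes are trivially evaded by re-encoding.
      import hashlib

      def is_blocked(image_bytes: bytes, known_bad_hashes: set[str]) -> bool:
          return hashlib.sha256(image_bytes).hexdigest() in known_bad_hashes

      def accept_media(image_bytes: bytes, known_bad_hashes: set[str]) -> bool:
          """Hypothetical hook for both local uploads and inbound federated media."""
          if is_blocked(image_bytes, known_bad_hashes):
              # Reject and escalate to admins / reporting channels as appropriate.
              return False
          return True
      ```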