cross-posted from: https://jamie.moe/post/113630

There have been users spamming CSAM content in the lemmyshitpost community, causing it to federate to other instances. If your instance is subscribed to this community, you should take action to rectify it immediately. I recommend performing a hard delete via command line on the server.

I deleted every image from the past 24 hours personally, using the following command:

sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred {} \;
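
If you want to sanity-check before shredding, the same find with -print instead of -exec is non-destructive and just lists the files that would be affected (same example path as above; adjust it to your own deployment):

# dry run: list matching files without touching them
sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -print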

Note: Your local jurisdiction may impose a duty to report or other obligations. Check what applies to you, but always prioritize ensuring that the content does not continue to be served.

Update

Apparently the Lemmy Shitpost community is shut down as of now.

  • @[email protected]
    link
    fedilink
    English
    692 years ago

    I literally am going to give up social media in general if this doesn’t stop

    Saw it last night around 3am. Shit made me sick; I honestly almost cried, but I just closed the app and tried not to think about it

    Whatever the goal is, it’s a stark reminder that there are monsters creeping in the shadows everywhere you go

      • @[email protected]
        link
        fedilink
        English
        172 years ago

        I’d think most trolls would pass up on doing something that can easily get them imprisoned if ever found out …

      • regalia · 70 points · 2 years ago

        This isn’t trolling, this is just disgusting crime.

        • @[email protected]
          link
          fedilink
          English
          1
          edit-2
          2 years ago

          The crime happened in the past when the children were abused. This is some weird amalgam of criminal trolling.

          Edit: yeah yeah I get that csam is criminal, that’s why I called it an amalgam. It’s both trolling and criminal.

          • chiisana · 11 points · 2 years ago

            Depending on jurisdiction, I am not a lawyer, etc etc, but I’d imagine with a fairly high degree of probability that re-distribution of CSAM is also a crime.

          • Dark Arc · 18 points · 2 years ago

            It’s still a crime. Taking the pictures is a crime. Sharing the pictures is also a crime.

          • @[email protected]
            link
            fedilink
            English
            22 years ago

            The crime happened in the past when the children were abused.

            That’s true. You could look at it that way and stop right there and remain absolutely correct. Or, you could also look at it from the eventual viewpoint of that victim as a human being: as long as that picture exists, they are being victimized by every new use of it, even if the act itself was done decades ago.

            Not trying to pile on, but anyone who has suffered that kind of violation as a child suffers for life to some extent. There are many who kill themselves, and even more that cannot escape addiction because the addiction is the only safe mental haven they have where life itself is bearable. Even more have PTSD and other mental difficulties that are beyond understanding for those who have not had their childhood development shattered by that, or worse, had that kind of abuse be a regular occurrence for them growing up.

            So to me, adding a visual record of that original violating act to the public domain that anyone can find and use for sick pleasure is an extension of the original violation and not very different from it, IMO.

            The visual records are kind of a sick gift that never stop giving, and worse still if the victim knows the pics or videos are out there somewhere.

            I am well aware not everyone sees it this way, but an extra bit of understanding for the victims would not go amiss. Imagine being an adult and browsing the web, thinking it’s all in the past and maybe you’re safe now, and stumbling across a picture of yourself being raped at the age of five, or whatever, or worse still, having friends or family or spouse or children stumble across it.

            So speaking only for myself, I think CSAM is a moral crime whenever it is accessed, one of the most hellish that can be committed against another human being, regardless of the specificities of the law.

            I don’t have a problem with much else that people share, but goddamn I do have a problem with that.

  • @[email protected]
    link
    fedilink
    English
    112 years ago

    There was a weird JSON error I was getting in the last few minutes. I’m not sure if this is at all related.

  • @[email protected]
    link
    fedilink
    English
    682 years ago

    Big F in chat for those of you dealing with this. My #1 fear about setting up an instance.

    • @[email protected]
      link
      fedilink
      English
      422 years ago

      It impacts everyone when this shit happens. It takes time for mods/admins to take it down. And you can’t unsee it.

      I hope nobody else has the misfortune of stumbling on that shit

      • @[email protected]
        link
        fedilink
        English
        52 years ago

        Yeah you really can’t. I’m pretty desensitized from earlier internet with death and other shock gore content but had managed to avoid CSAM until today. It was a lot worse than I expected, felt my heart drop. Worse, my app autoplays gifs in thumbnail so it kept going while I was reporting it.

        I’ve mostly forgotten and it wasn’t on my mind until I saw this thread (happened less than 24hr ago) but even the slightest reminder is oddly upsetting. Wish I’d thought of the Tetris thing.

      • Bleeping Lobster · 41 points · 2 years ago

        There have been studies which found that playing Tetris for an hour or two after seeing something traumatic can prevent it from taking root in our long-term memory.

        I tried it once after accidentally clicking a link on reddit that turned out to be gore. I can’t remember exactly what it was now (about 9 months later), so it must have worked

  • Neuromancer · 14 points · 2 years ago

    If the source deletes the post, won’t that remove it from all the instances?

    • @[email protected]
      link
      fedilink
      English
      82 years ago

      You’ll need to find where the actual container files are being stored. I’m unfortunately not familiar with Lemmy Easy Deploy, but you should have a folder that has some files/folders like docker-compose.yml, volumes, lemmy.hjson.

      The important one is the volumes/pictrs/files folder. Take the full path of that folder, use it in place of the /srv/lemmy/example.com... path from the original post, and then that command should work.
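
      If you’re not sure where Lemmy Easy Deploy put that volume, a rough way to locate it (assuming the default pictrs/files layout; searching from / is slow but thorough, so narrow the starting directory if you know your deployment folder) is:

      # find any pict-rs files directory on the host
      sudo find / -type d -path '*pictrs/files' 2>/dev/null

      Whatever path that prints is what goes into the shred command.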

  • ugjka · 21 points · 2 years ago

    blocked lemmyshitpost some time ago because it is trash anyway

    • Hangry · 3 points · 2 years ago (edited)

      Just google it.

      Edit: Not Safe For Life

    • @[email protected]
      link
      fedilink
      English
      40
      edit-2
      2 years ago

      Child sexual abuse material - underage porn. For obvious reasons, you don’t want this to be something you’re hosting automatically out of your basement server.

        • TheRealKuni · 12 points · 2 years ago

          I’ve been listening to the audiobook for American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer and the number of times they say “CP” as an abbreviation for “Communist Party” is too damn high.

          Also, last time I went to the amusement park Cedar Point, they had “CP” as an abbreviation on all sorts of stuff.

          Made me chuckle, but I do think it’s perhaps time to move to the abbreviation CSAM since it’s less likely to get used for other purposes.

          • @[email protected]
            link
            fedilink
            English
            152 years ago

            There’s a lot of porn that wasn’t made consensually either. I don’t care what we refer to CSAM as but I think it’s important to acknowledge that.

          • @[email protected]
            link
            fedilink
            English
            262 years ago

            In what world would anyone think that CP implies consent? I mean, the word ‘child’ is right there. Do you think that the term ‘child soldiers’ implies consent? I don’t have anything against the term CSAM but if it was created because of doubts around consent it was a silly reason to create it.

            • @[email protected]
              link
              fedilink
              English
              112 years ago

              The term originates from professionals - psychiatrists etc - who work in that field, because they knew even decades ago that “porn” is the wrong word for this kind of material.

              • @[email protected]
                link
                fedilink
                English
                12 years ago

                I think it’s more likely some people working in those fields wanted to improve their career by popularizing a new term.

            • @[email protected]
              link
              fedilink
              English
              52 years ago

              I think it has less to do with the existence of non-consensual porn than with the possibility and, indeed, existence of vast amounts of consensual porn. Consent is very much possible in adult porn; it isn’t with CSAM. It’s also possible with soldiers, though of course conscription exists, and if you ask a random Ukrainian they’d rather not have to be a soldier for their loved ones to be protected.

  • @[email protected]
    link
    fedilink
    English
    182 years ago

    I was looking into self hosting. What can I do to avoid dealing with this? Can I not cache images? Would I get in legal trouble for being federated with an instance being spammed?

  • @[email protected]
    link
    fedilink
    English
    192 years ago

    I went ahead and just deleted my entire pictrs cache and will definitely disable caching other servers’ images when it becomes available.

  • @[email protected]
    link
    fedilink
    English
    652 years ago

    Likely scum moves from reddit patriots to destroy or weaken the fediverse.

    I remember when Murdoch hired that Israeli tech company in Haifa to find weaknesses in TV smart cards and then leaked it to destroy their market by flooding it with counterfeit smart cards.

    They are getting desperate along with those DDOS attacks.

    • OrbitJunkie · 27 points · 2 years ago

      Could be, but more likely it’s just the result of having self-hosted services: you have individuals exposing their own small servers to the wilderness of the internet.

      These trolls also try constantly to post their crap to mainstream social media but they have it more difficult there. My guess is that they noticed Lemmy is getting a lot of traction and has very poor media content control. Easy target.

      Moderating media content is a difficult task and for sure centralized social media have better filters and actual humans in place to review content. Sadly, only big tech companies can pay for such infrastructure to moderate media content.

      I don’t see an easy way for federated servers to cope with this.

      • @[email protected]
        link
        fedilink
        English
        122 years ago

        Yeah exactly. This is the main reason I decided not to attempt to self host a Lemmy instance. No way am I going to let anyone outside of my control have the ability to place a file of their choosing on my hardware. Big nope for me.

  • Catasaur · 25 points · 2 years ago (edited)

    Self-hoster here, I’m nuking all of pictrs. People are sick. Luckily I did not see anything, however I was subscribed to the community.

    • Did a shred on my entire pictrs volume (all images ever):

    sudo find /srv/lemmy/example.com/volumes/pictrs -type f -exec shred {} \;

    • Removed the pictrs config in lemmy.hjson

    • Removed the pictrs container from docker compose

    Anything else I should do to protect my instance, besides shutting down completely?
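
    For reference, a minimal sketch of the compose cleanup step, assuming a standard docker compose setup where the pict-rs service was named pictrs and has already been removed from docker-compose.yml: recreating the stack with --remove-orphans also removes the now-orphaned container.

    # run from the folder containing docker-compose.yml
    docker compose up -d --remove-orphans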

      • 𝒍𝒆𝒎𝒂𝒏𝒏 · 1 point · 2 years ago

        You don’t; those are the collateral damage.

        IMO it’s better to just nuke every image from the last 24 hours than to subject yourself to that kind of heinous, disgusting content

  • owiseedoubleyou · 35 points · 2 years ago (edited)

    How desperate to destroy Lemmy must you be to spam CSAM on communities and potentially get innocent people into trouble?

    • @[email protected]
      link
      fedilink
      English
      132 years ago

      Maybe you’re a dev on the Reddit team and own a lot of shares in what you know is about to go public?