Sorry for the short post, I’m not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:

Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking some steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.

It will not be possible to upload any new avatars or banners while this limit is in effect.

I’m really sorry for the disruption, it’s a necessary trade-off for now until we figure out the way forward.

  • iByteABit [he/him]
    20 · 2 years ago

    This is a very good decision. I've worried about this problem ever since I first learned about the Fediverse. Research should definitely be done to find CSAM detection tools that can integrate into Lemmy; perhaps we could make a separate bridge repo that integrates a tool like that easily into the codebase.

    I hope every disgusting creature that uploads that shit gets locked up

  • Ulu-Mulu-no-die
    19 · 2 years ago

    That’s disgusting! You did the right thing. Sorry you admins and mods have to put up with that shit; I hope the instance owners being attacked are reporting it to local authorities.

    • @[email protected]
      10 · 2 years ago (edited)

      I don’t think any made it onto this server; with the 100 kB upload limit in place, the risk was already rather low. It’s a preventive measure. So far lemmy.world was the one deliberately targeted.

    • @[email protected]
      25 · 2 years ago

      I’m going to go out on a limb and say they, and all the other instances hit by this attack, probably did. Which authorities, I don’t know. If this instance is hosted in Estonia then probably the Estonian authorities, but it’s probably hosted in the cloud, so is it REALLY hosted in Estonia? There are a ton of American and EU users, so hopefully the FBI and whatever the EU equivalent is. But honestly, cybercrime cases can get confusing because people and hosting are spread out all over the world, and it can be hard to even figure out who to report to.

      • @[email protected]
        1 · 2 years ago

        Europol in Europe. But you can report it to your national cybercrime division and they can refer it to the appropriate authority if necessary.

  • sleepy
    13 · 2 years ago

    Well, that sucks. I wanted to share some cute pics I took of my cats.

      • @[email protected]
        2 · 2 years ago

        I would’ve recommended catbox.moe, but I have FUD about it now. I assume they have themselves together, but I can’t know for sure.

        • @[email protected]
          2 · 2 years ago (edited)

          Also, your fears are justified, but I just checked: Ghostery doesn’t show any trackers on catbox’s part, so it’s safe to use… for now. But one has to stay vigilant and check regularly to see what happens with them.

        • @[email protected]
          2 · 2 years ago

          Well, we could always use other suggestions. Imgur is spyware in itself, but what can we do? (I put it in a freezer app so I don’t get wiretapped; suddenly my videos started being uploaded in GIF form by them to save on bandwidth, lol 😭)

  • Cris
    48 · 2 years ago

    I know there are automated tools that exist for detecting CSAM. Given the challenges the fediverse has had with this issue, it really feels like it’d be worthwhile for the folks developing platforms like Lemmy and Mastodon to start thinking about how to integrate those tools into their platforms, to better support moderators and folks running instances.
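[Editor’s note: the detection tools mentioned here generally work by hashing each upload and checking it against a database of known-bad hashes. A minimal sketch of that idea, under stated assumptions — the `BLOCKED_HASHES` set and `screen_upload` name are hypothetical, and real systems (e.g. PhotoDNA, PDQ) use perceptual hashes that survive re-encoding, not the plain SHA-256 shown here:]

```python
# Hypothetical sketch of hash-based upload screening. Plain SHA-256
# only matches byte-identical files; production tools use perceptual
# hashes supplied by organizations such as NCMEC, not a hardcoded set.
import hashlib

# Assumed blocklist of known-bad hex digests (illustrative only;
# this entry is simply the SHA-256 of the empty byte string).
BLOCKED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def screen_upload(data: bytes) -> bool:
    """Return True if the upload is allowed, False if it matches the blocklist."""
    digest = hashlib.sha256(data).hexdigest()
    return digest not in BLOCKED_HASHES

print(screen_upload(b""))               # False (matches the blocked digest)
print(screen_upload(b"cat photo bytes"))  # True
```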

    • Spaghetti_Hitchens
      22 · 2 years ago

      I just shut down my instance because of this attack. Once there are better controls to prevent this, I will stand it back up.

      • Cris
        10 · 2 years ago

        Yeah, there was a gardening instance run by a great guy who just did the same

      • @[email protected]
        7 · 2 years ago

        What do you think the purpose of these attacks is? The fediverse is so small in the grand scheme that I can only assume the worst.

      • Xusontha
        4 · 2 years ago

        Good thing my instance is only friends and friends of friends, otherwise I’d have to do the same

        What was your instance?

      • @[email protected]
        1 · 2 years ago (edited)

        Good, it’s an API that can fit different tools even if one is promoted. Upgrading means switching out a binary file. POSIX modularization FTW.

      • Cris
        8 · 2 years ago (edited)

        That’s fucking dope, thank you very much for the link to the issue!

  • Mindfury [he/him]
    48 · 2 years ago

    fucking disgusting, and I’m sorry you and your mods, admins and users were subjected to this

  • Awoo [she/her]
    25 · 2 years ago

    If you’re concerned about legal liability, I think it’s worth noting that there is some protection for websites in this matter. For the most part, as long as you’re taking “reasonable action” against it, you’re not liable, and most laws take into consideration the resources of the site dealing with the uploads.

    Not pleasant for users, though, of course. And the speed at which it’s handled is obviously a concern.

  • Queen HawlSera
    17 · 2 years ago

    This sucks, but given the circumstances it’s sadly an understandable and necessary course of action.

  • @[email protected]
    2 · 2 years ago

    Better shut the internet down, then. This will only continue to worsen now that anybody can generate whatever images they want with AI assistance. Such image hashes will not be in CSAM databases (if AI-generated imagery even counts as CSAM).
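[Editor’s note: the limitation this comment points at — hash databases only catching *known* images — can be illustrated with a toy “average hash.” This sketch is illustrative only and is neither PhotoDNA nor PDQ; the pixel grids and function names are made up for the example:]

```python
# Toy perceptual ("average") hash: a bit is 1 where a pixel exceeds the
# image's mean brightness. A re-encoded copy of a known image hashes
# close to the database entry and is caught; a freshly generated image
# hashes to something no entry is near, so it is missed.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (e.g. an 8x8 downscaled image).
    Returns a bit string: '1' where the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hash strings."""
    return sum(x != y for x, y in zip(a, b))

known = [[10, 200], [10, 200]]    # image already in the database
copy_ = [[12, 198], [11, 199]]    # same image with compression noise
novel = [[200, 10], [200, 10]]    # an entirely new image

db = {average_hash(known)}
print(hamming(average_hash(copy_), average_hash(known)))  # 0 -> caught
print(min(hamming(average_hash(novel), h) for h in db))   # 4 -> missed
```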

  • RoomAndBored [he/him, any]
    36 · 2 years ago

    This is foul and I am extremely sorry for the users and mods who were sent the CSAM. It isn’t something they should expect to deal with in a voluntary role for their communities and it can be traumatic. I hope they are given time and space to process their emotions.