Hey folks!

I made a short post last night explaining why image uploads had been disabled. It was the middle of the night for me, so I did not have time to go into much detail, but I’m writing this more detailed post now to clear up where we are and where we plan to go.

What’s the problem?

As shared by the lemmy.world team, over the past few days, some people have been spamming one of their communities on lemmy.world with CSAM images. Lemmy has been attacked in various ways before, but this is clearly on a whole new level of depravity, as it’s first and foremost an attack on actual victims of child abuse, in addition to being an attack on the users and admins on Lemmy.

What’s the solution?

I am putting together a plan, both for the short term and for the longer term, to combat and prevent such content from ever reaching lemm.ee servers.

For the immediate future, I am taking the following steps:

1) Image uploads are completely disabled for all users

This is a drastic measure, and I am aware that it’s the opposite of what many of our users have been hoping for, but at the moment, we simply don’t have the necessary tools to safely handle uploaded images.

2) All images which have federated in from other instances will be deleted from our servers, without any exception

At this point, we have millions of such images, and I am planning to indiscriminately purge all of them. Posts from other instances will not be broken after the deletion; the images will simply be loaded directly from those instances instead.

3) I will apply a small patch to the Lemmy backend running on lemm.ee to prevent images from other instances from being downloaded to our servers

Lemmy has always loaded some images directly from other servers, while saving others locally to serve directly. I am eliminating the second option for the time being, forcing all images uploaded on external instances to always be loaded from those servers. This will somewhat increase the number of servers that users fetch images from when opening lemm.ee, which certainly has downsides, but I believe it is preferable to opening up our servers to potentially illegal content.
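To illustrate the idea behind the patch, here is a rough Python sketch of the logic (not the actual Rust code running on lemm.ee; the instance name, function, and proxy endpoint are all made up for illustration):

```python
from urllib.parse import quote, urlparse

LOCAL_INSTANCE = "lemm.ee"  # hypothetical config value

def image_url_for_client(image_url: str, cache_remote: bool) -> str:
    """Return the URL that clients should use to load an image.

    With cache_remote=True (Lemmy's usual behaviour), remote images are
    routed through a local proxy endpoint, which downloads and stores a
    copy on our servers. With cache_remote=False (the patched
    behaviour), remote images are returned untouched, so they are
    always fetched from their originating instance.
    """
    host = urlparse(image_url).netloc
    if host == LOCAL_INSTANCE or not cache_remote:
        return image_url
    # Hypothetical local proxy/caching endpoint, for illustration only.
    return f"https://{LOCAL_INSTANCE}/image_proxy?url={quote(image_url, safe='')}"
```

The patch essentially hard-codes the `cache_remote=False` branch: locally uploaded images are still served from our own storage, while everything else points straight at the origin server.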

For the longer term, I have some further ideas:

4) Invite-based registrations

For a long time, I have believed that one of the best ways to effectively combat spam and malicious users is to implement an invite system on Lemmy. I have wanted to work on such a system ever since I first set up this instance, but real life and other things have been getting in the way, so I haven’t had a chance. However, with the current situation, I believe this feature is more important than ever, and I’m very hopeful I will be able to make time to work on it very soon.

My idea is to grant each of our users a few invites, which would replenish every month if used. After that point, an invite will be required to sign up on lemm.ee. The system will keep track of the invite hierarchy, and in extreme cases (such as spambot sign-ups), inviters may be held responsible for rule-breaking users they have invited.

While this will certainly create a barrier of entry to signing up on lemm.ee, we are already one of the biggest instances, and I think at this point, such a barrier will do more good than harm.

5) Account requirements for specific activities

This is something that many admins and mods have been discussing for a while now, and I believe it would also be an important feature for lemm.ee. Essentially, I would like to limit certain activities to users who meet specific requirements (maybe account age, amount of comments, etc). These activities might include things like image uploads, community creation, and perhaps even private messages.
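A minimal sketch of what such a gate might look like (the thresholds and activity names here are purely hypothetical, not decided values):

```python
from dataclasses import dataclass

# Hypothetical thresholds; real values would be tuned by the admins.
REQUIREMENTS = {
    "upload_image": {"min_age_days": 30, "min_comments": 20},
    "create_community": {"min_age_days": 14, "min_comments": 10},
    "send_private_message": {"min_age_days": 1, "min_comments": 0},
}

@dataclass
class Account:
    age_days: int
    comment_count: int

def may_perform(account: Account, activity: str) -> bool:
    """Check whether the account meets the bar for a restricted activity."""
    req = REQUIREMENTS[activity]
    return (account.age_days >= req["min_age_days"]
            and account.comment_count >= req["min_comments"])
```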

6) Automated ML based NSFW scanning for all uploaded images

I think it makes sense to automatically scan all images before we save them on our servers; if an image is flagged as NSFW, then we simply don’t accept the upload. While machine learning is not 100% accurate and will produce false positives, I believe this is a trade-off that we simply need to accept at this point. Not only will this help against any potential CSAM, it will also help us better enforce our “no pornography” rule.
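The gate itself would be simple; the hard part is the model. A minimal sketch with the classifier stubbed out (the threshold is a made-up example, and `classify_nsfw` stands in for a real image-classification model):

```python
NSFW_THRESHOLD = 0.7  # hypothetical cut-off; tuning it trades false positives against misses

def classify_nsfw(image_bytes: bytes) -> float:
    """Placeholder for a real ML model returning P(image is NSFW)."""
    raise NotImplementedError("plug in an actual image classifier here")

def accept_upload(image_bytes: bytes, classifier=classify_nsfw) -> bool:
    """Reject an upload before it ever touches disk if it looks NSFW."""
    return classifier(image_bytes) < NSFW_THRESHOLD
```

Crucially, the check runs before the image is written to storage, so flagged content never exists on our servers at all.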

This would potentially also allow us to resume caching images from other instances, which will improve both performance and privacy on lemm.ee.


With all of the above in place, I believe we will be able to re-enable image uploads with a much higher degree of safety. Of course, most of these ideas come with some significant downsides, but please keep in mind that users posting CSAM present an existential threat to Lemmy (in addition to just being absolutely morally disgusting and actively harmful to the victims of the abuse). If the choice is between having a Lemmy instance with some restrictions, or not having a Lemmy instance at all, then I think the restrictions are the better option.

I also would appreciate your patience in this matter, as all of the long-term plans require additional development, and while this is currently a high-priority issue for all Lemmy admins, we are all still volunteers and do not have the freedom to dedicate huge amounts of time to working on new features.


As always, your feedback and thoughts are appreciated, so please feel free to leave a comment if you disagree with any of the plans or if you have any suggestions on how to improve them.

  • Draconic NEO

I think images should never be cached from other instances in the first place; that is a huge oversight in pictrs. Not only does it have the potential to cache unwanted content, it also causes the hosted images to rapidly accumulate, which increases storage requirements and is unfair to people who want to self-host a personal instance. Hosting a personal instance should not come with monstrous storage requirements or serious liability risk due to all images being cached automatically. It should only cache what is uploaded to the instance itself, like profiles, banners, and posts that include images from the instance.


I have reservations about fully invite-based registrations on Lemmy instances. While I do think invites might be good as a way for users to skip filling out an application, I don’t really like the idea of requiring them like Tildes does; it makes the instance feel like an elitist, exclusive club where you have to beg users for an invite. Invites shouldn’t be an alternative to application-based registration, but rather a supplement to it: if someone can get an invite from a user, that’s great, but if not, they should still be able to write an application to join. The application could be extensive and lower priority to process, since invites would be available, but it should still be an option.


Account requirements really depend on what they are and what they restrict (and also on who on the instance is allowed to impose restrictions). For example, on instances with downvotes enabled, I think score/upvote requirements are a bad idea, since they essentially mean that people who disagree are locked out, like karma restrictions on Reddit. I do not support this: it creates an echo chamber where unpopular opinions are shut out. It’ll also lead to upvote farming if there are penalties for having a low score.

Comment or post requirements would just lead to post or comment farming, similar to vote farming, though they’re not as bad as score requirements: posts and comments that people make naturally (whether they are liked or not) can’t be taken away by other people based on opinions, only removed if they break the rules, which isn’t even remotely similar since in that case they broke the rules.

Limiting image uploading is a fair requirement in my opinion, since uploads can be particularly harmful if malicious, and uploads aren’t really needed anyway, since people can externally host almost all of their images.

When it comes to DMs and restrictions around them, I feel like that should be up to individual users: each user should decide whether to allow private communication from certain users, or to allow DMs at all. This shouldn’t be something globally applied to people. Maybe it could be a default in user settings, with the requirement set by the admins, but people should be able to turn it off if they don’t care or want to accept messages from new users. I know I certainly will; I hate being nannied about who’s allowed to send me messages. IMO, annoying or uncomfortable DMs are a fact of life, and I prefer to deal with issues when they happen rather than block every new user who might want to talk to me. It’s one of the things I hated that Reddit does without giving me the option to opt out and receive messages from everyone.


I think having a machine-learning based system to identify malicious images is actually a pretty good idea going forward. I know how some people feel about AI and machine learning, but I think it’s probably our best defense, considering that none of us want to see this material. It might have false positives, but I’d rather have that than allow CSAM to live here. Ultimately, the choice is between ML scanning and disabling pictrs here, and I think ML is the better option, because people are going to want to have avatars, and without pictrs that isn’t possible (unless Lemmy adds support to the UI for externally hosted avatars and banners).


  • Xusontha

    I will apply a small patch to the Lemmy backend running on lemm.ee to prevent images from other instances from being downloaded to our servers

    If possible, could you tell others how to apply this patch to their own server?

  • @[email protected]

    This is something that many admins and mods have been discussing for a while now, and I believe it would be an important feature for lemm.ee as well. Essentially, I would like to limit certain activities to users which meet specific requirements (maybe account age, amount of comments, etc). These activities might include things like image uploads, community creation, perhaps even private messages.

Sounds like the old karma requirements some reddit subs had. While I’m not against that, it would restrict locally registered users more than users posting to lemm.ee communities from host instances that have no such system in place. I’m aware that if they post images, those would be uploaded to their home instance and linked here with the patch you mentioned above, but the downside is that local users might feel more inconvenienced than others. Not saying it’s a bad idea though, if we are thinking from a “protect lemm.ee” angle first and foremost.

    Automated ML based NSFW scanning for all uploaded images

    You might want to reach out to the dev of Sync for Lemmy, ljdawson on [email protected], he just implemented an anti-NSFW upload feature in the app to do his part. Essentially, Sync users currently can’t post any kind of porn. While I don’t think that the CP spammers were using his particular app, or any app to begin with, I do think it’s a neat feature to have, but would make much more sense to run server-side.

    • @[email protected]

      he just implemented an anti-NSFW upload feature in the app to do his part. Essentially, Sync users currently can’t post any kind of porn

      but what about normal, legal, NSFW material?

      • @[email protected]

        Not allowed on lemm.ee in the first place. Well, you can see NSFW posts and subscribe to everything on lemmynsfw.com but you’re not supposed to post any porn from a lemm.ee account.

        Policing NSFW is a whole can of worms, it makes sense to leave it to specialised instances. They can nuke political drama from orbit, we can nuke nudity from orbit, both saving mod bandwidth to do the other thing right.

  • @[email protected]

    Could you post a guide on disabling the local image cache? I compile from scratch so I’m not afraid of making changes in the code, I just don’t really know rust. I shut down my personal instance and this would allow me to turn it back on.

  • @[email protected]

    I wonder how hard it would be to fund a full time staff to review content. That’s how other platforms do it.

    • @[email protected]

Other platforms also use armies of unpaid volunteers to do it. There are various methods, but with this being an entirely volunteer-run and volunteer-financed platform, I really doubt it is feasible. In the long term, I like the idea of using technology to improve detection and moderation, even if that requires some development commitment.

  • bdesk

You forgot getting the authorities involved when somebody does upload CSAM

    • @[email protected]OP

The lemmy.world team is already getting some authorities involved for this particular case. I am definitely in favor of notifying law enforcement or relevant organizations, and if anybody tries to use lemm.ee to spread such things, I will definitely be involving my local authorities as well.

    • TWeaK

      getting the authorities involved

      How do you imagine that playing out? This isn’t some paedophile ring trading openly, this is people using CSAM as an attack vector. Getting over-enthusiastic police involved is exactly their goal, and will likely do very little to help the victims in the CSAM itself.

      Yes, authorities should be notified and the material provided to the relevant agencies for examination. However that isn’t truly the focus of what’s happening here. There is no immediate threat to children with this attack.

      • @[email protected]

        How do you imagine that playing out?

        FBI: Whoa that illegal

        Admin: Ya

        FBI: We’re going to look for this guy

        Admin: alright

        END ACT 1

        • TWeaK

          This isn’t something the FBI have much involvement with. The FBI deal with matters across states.

          This isn’t America, where you have a bunch of separate states unified under one American government. People haven’t been posting porn to lemm.ee. People have been posting porn to other instances, which has seeped through to lemm.ee.

          Getting the Estonian law enforcement involved is like trying to get the Californian government involved in dealing with a problem from Texas. Estonian law enforcement have no jurisdiction over lemmy.world or any other instance, and giving them an opportunity is only going to lead to locking down lawful association and communication in favour of some vague “think of the children” rhetoric. And, like I say, it won’t do anything to curtail the production of CSAM as the purpose of this attack has little to do with the promotion of CSAM.

          Frankly, it could easily be more like:

          lemm.ee: We’ve got a problem with illegal content

          Estonian law enforcement: Woah that’s illegal.

          Estonian law enforcement: You’ve admitted to hosting illegal content. We’re going to confiscate all your stuff.

          lemm.ee is shut down pending investigation.

          Meanwhile, if lemm.ee continues its current course of action, yet someone notifies law enforcement:

          Estonian law enforcement: Woah, we’ve got a report of something dodgy, that’s illegal.

          lemm.ee: People tried to post illegal content elsewhere that could have come to our site, we blocked and deleted it to the best of our ability.

          Estonian law enforcement: Fair enough, we’ll see what we can figure out.

It really matters how and when the problem is presented to law enforcement. If you report yourself, they’re much more likely to take action against you than if someone else reports you. It doesn’t do yourself any favours to present your transgressions to them, not unless you’re absolutely certain you’re squeaky clean.

          At this stage and in these circumstances, corrective action is more important than reporting.

          • @[email protected]

            You’re assuming that no American user saw any of the content. I think the FBI could absolutely get involved if the content was seen by anyone in the US, let alone by people in more than 1 state. I’m not going to pretend to be an expert on child abuse or cyber crimes but the FBI devotes massive resources to investigation of crimes against children and could potentially at least help other agencies investigate where this attack originated from. And if the FBI were able to determine that the attack originated from the US, I assure you the DOJ is far less kind to people who possess, commit or distribute that type of horrible child abuse than they are to rich old white men who commit a coup. You’re kind of acting like this is just another DDOS attack rather than the deliberate distribution of horrific images of child abuse to a platform that in no way encourages distribution of child abuse material.

            Anywhooooo the problem was much worse on lemmy.world since they were the main target of the attack. Does anyone know if they reported it?

            • @[email protected]

Local authorities will be the contact point of the admins (or the authorities of wherever the servers are hosted). They’ll investigate what they can and then ring up euro/inter/whatever pol as necessary to have other forces handle stuff in their respective jurisdictions. Cross-border law enforcement isn’t exactly uncharted waters; they’ve been doing it for quite a while.

              As to the current case the ball is clearly in the field of lemmy.world admins and their local authorities (Germany? Hetzner, I think, as so many) as they’re the ones with the IP logs. Even if the FBI gets a tip-off because an American saw anything they’re not exactly in a position to do anything but go via Interpol and ask the BKA if they’d like to share those IP logs.

    • @[email protected]

      It’s a known tactic by trolls to upload cheese pizza and then notify the media/the authorities themselves because context doesn’t matter when it comes to CSAM

  • zeus ⁧ ⁧ ∽↯∼

    thank you for your work sunaurus, and i’m sorry you had to sort through this

    (particularly annoying though, as i never got around to adding a user banner; and i had one in mind as well. i wish there was some way to externally host avatars and banners)

  • @[email protected]

Seems like a good plan. I have been very impressed with your approach to administering lemm.ee.

    Regarding the planned invite system, what would be the consequences of inviting a malicious user? I would think it would be hard to enforce any consequences simply because of the open nature of lemmy as an ecosystem.

  • @[email protected]

    This has been a great instance since day one, and it’s good to see you once again being so proactive. Thank you for the update!

    There are downsides with all kinds of moderation, but ultimately most of us accept that the internet can’t function as a true free-for-all. Absolutely in support of whatever you feel is necessary to keep the server safe, but please watch out for yourself too and make sure you’re asking for help where needed.

    p.s. anyone reading this who doesn’t donate to the server yet, here’s a reminder that that’s a thing you can do.

  • TWeaK
    1. All images which have federated in from other instances will be deleted from our servers, without any exception

    At this point, we have millions of such images, and I am planning to just indiscriminately purge all of them. Posts from other instances will not be broken after the deletion, the deleted images will simply be loaded directly from other instances.

    My impression was that this was how this worked from the beginning, but apparently that’s wrong. I thought the host instance (that is, the instance of the user making the post, not necessarily the instance of the community) would be the host of the image. Instead, it seems like instances share images and whatnot between themselves, to distribute the load to their own users.

    Maybe this core principle is flawed. It should definitely be reviewed, anyway.

  • @[email protected]

    I’m going to be a part of an invite only community?! Of course, given the circumstances, this is pretty fucked. But I feel kinda fancy right now.

    Thanks for all you do on lemm.ee

  • edric

    I didn’t even know there was an option to load images directly from the source instance instead of caching the content locally. I know it’s a resource issue and it can slow things down a bit for users, but I think ultimately it should be done that way by default, to mitigate exponential propagation of illegal content. Wasn’t caching the main reason why lemmy.world preemptively blocked piracy communities?

    That, or admins should be able to selectively choose what communities to cache content from, like maybe the ones where they can confirm there is active moderation.

    • @[email protected]

      Privacy-minded users want caching because otherwise it means they’re connecting to multiple (possibly malicious) websites instead of just lemm.ee (someone made a post that would grab your IP and show it to you, for example). It’s difficult.

      • edric

        Good point. I was imagining users grabbing content from the source instance via their local instance as a proxy, which would hide their info. Obviously I don’t know how the backend works, so if the alternative is direct connectivity exposing your info, then yeah that’s definitely something to think about.

  • @[email protected]

    IMO Lemmy shouldn’t have media uploading of any kind. Aside from the CSAM risk, it’s unsustainable and I think one of the reasons Reddit went to shit is by getting into the whole image/video/gif hosting.

    Dozens of media hosts exist out there, and the mobile/web clients should focus instead on showing remote content better.

    • Billygoat

      The flip side of the argument is that if you also host the media you are not at risk of having broken links. I’ve seen a number of long running forums that had post bodies that contained external images that are now broken.

      Of course an argument can be made that the only reason that those forums have lived for so long was due to not having costs associated with hosting media.

      • TWeaK

        That’s no worse than a reddit link getting borked because it’s been cross-posted and someone managed to kill the original link with a DMCA notice.

        • DMmeYourNudes

A post getting removed because someone threatened legal action is not the same as using an image host that goes under because nobody visits their site to see the ads that pay for the hosting, or because they arbitrarily purged their content or changed their link format like Imgur has. Unless Lemmy hosts its own images, it will be at risk of being purged like that, as has happened many times over.

        • Billygoat

          I would say that is a different issue. DMCA could go to whatever external host as well so that doesn’t change.

          My argument was about putting faith in external providers to stay alive to continue hosting media. You can also get in a situation where an external provider decides to do a mass delete like what Imgur did this past summer.

    • @[email protected]

I get that we don’t trust these third-party image hosting sites, but if it’s that or having local images that can potentially bring down instances, I’d say it’s a no-brainer of a compromise.

Upload sites like Imgur automatically handle image detection and take the load off smaller servers. It seems like a perfect solution for now.

  • Nix

    These are great ideas especially the ability for users to invite others. I think it’s also a good way to get new people into the fediverse since inviting someone will have them easily know what instance to go to.

    Will you submit all these features to the official lemmy backend too?

  • PatFusty

Out of curiosity, how were the CSAM images discovered? What’s stopping anyone from creating their own xyz instance and posting where nobody can see?

    • @[email protected]

There is probably at least one non-federated instance dedicated to CSAM. These images were uploaded to lemmy.world as an attack, and because of the way federation works, some of the content is then downloaded onto the servers of other instances when it’s viewed or interacted with from a federated instance.

    • GarbageShoot [he/him]

I think a perpetrator who makes their own instance (and thereby has their own website, etc.) is going to be much more likely to face actual legal reprisal, because they had to buy the domain somehow and have servers somewhere.

      • PatFusty

This is why I asked the question. What’s the point of posting CSAM on some random shitposting instance if not to jeopardize the whole instance?

I just looked up international CSAM laws, and apparently there are still some countries on this green earth that allow it. I think as long as countries like Russia or Cambodia don’t punish its distribution, it will keep making its way to Lemmy instances.

    • @[email protected]

Anyone browsing New would see them in lemmy.world’s shitposting community, but anyone who made their own instance (assuming it’s not federated yet) would essentially be on their own dark web, so there’s no way for anyone else to see/cache such images.