• 𝕯𝖎𝖕𝖘𝖍𝖎𝖙 · 15 points · 2 years ago

    And fuck all of the Lemmy users who demand that instance owners take on that risk for them.

    No, unpaid volunteers do not want jail time for the child porn you wish to post.

  • t�m · 33 points · 2 years ago

    Here I thought I could create a server and use it as an instance just to hold my profile, which I could then use to interact across the fediverse.

    • @[email protected] · 10 points · edited · 2 years ago

      Yeah, I have wanted that from day one. I want it to work like email: my identity on my own domain that I can bring anywhere, storing my comments, posts, and subscriptions, and that's it; maybe direct messages or explicitly saved posts too. Not every damn post that I read or subscribe to.

      • t�m · 19 points · 2 years ago

        But the federation issue with CSAM… I don’t want those issues.

          • @[email protected] · English · 11 points · 2 years ago

            Yes, but people can still browse content from your instance without logging in. There is nothing stopping people from viewing illegal material through your instance.

            • @[email protected] · 1 point · 2 years ago

              Section 230 makes this a non-issue. It would be like suing the phone company, especially if you don't moderate. If you moderate, then it can be argued that what is left has your endorsement. If you don't moderate, then you are simply a victim of vandalism.

                • @[email protected] · 3 points · 2 years ago

                  Free countries have to have an equivalent, or free online discussion spaces would be impossible. The Section 230 compromise is inevitable; without it, the internet would perish.

          • t�m · 1 point · 2 years ago

            Ok, I think I might be misunderstanding the issue; so it's more about bad actors than about cached copies of images?

  • @[email protected] · 16 points · 2 years ago

    What does CSAM exactly mean? (I understand the point of the meme completely, just never heard of that abbreviation.)

    • @[email protected] · 28 points · 2 years ago

      It’s child porn, and it is as horrifying as you’ve been told. Some scumbag trolls were posting it on lemmy.world’s memes community, so .world finally decided to close open signups.

        • @[email protected] · 51 points · 2 years ago

          a) “CP” is a very online phrase that I imagine hasn’t permeated popular culture, for good reason

          b) calling it “pornography” tacitly implies that it is arousing and/or serves a purpose; calling it “abuse material” removes any positive connotations

          • @[email protected] · English · 23 points · edited · 2 years ago

            I think being more specific is also a good thing. Two-letter acronyms are too broad. CSAM is unambiguous about what it refers to, but CP means many things. E.g., in software dev it’s often used for “control plane”. Some video games (e.g., Pokémon Go) use it for “combat power”. I think ESO used it for “champion points” (though that might have been a different MMO).

        • @[email protected] · 23 points · 2 years ago

          Porn is a legal thing that normal people enjoy. The term CSAM takes a stance that it is always abuse. I think they are basically interchangeable but CSAM is the currently preferred term.

            • @[email protected] · 10 points · 2 years ago

              Sometimes, terms need changing to separate it from something else. Porn in itself is legal and fine. When adding children in the mix it’s easy to get caught up in the porn part of the discussion rather than the child part.

              Separating the terms puts the focus more on the child abuse part.

        • @[email protected] · 11 points · 2 years ago

          That’s absurd. People aren’t stupid. We’re capable of understanding context, and playing semantic games with something so serious is quite honestly pretty offensive.

        • @[email protected] · 10 points · 2 years ago

          What would you say is the difference? I feel like the terms are interchangeable. The comment you replied to didn’t give the exact abbreviation but it gave the essence of what is meant by the picture.

          • 𝒍𝒆𝒎𝒂𝒏𝒏 · 11 points · 2 years ago

            By trying to imply a difference, OC could lead one to assume that they believe some part of the illegal material is not abuse.

            I vehemently disagree with that line of thinking. It is abuse, and that is why it is illegal.

            • @[email protected] · 5 points · edited · 2 years ago

              Who in their right mind thinks calling it CP is validating it in any way? Just because some morons decided to make a politically correct term for it doesn’t change what it is.

              • @[email protected] · 6 points · 2 years ago

                The politically correct term CSAM is to differentiate it from ordinary porn which is legal and at least somewhat socially acceptable and ensure that people understand that when children are involved, it is always abusive. The terms mean the same thing, but being precise with language is important.

    • gabe [he/him] · 41 points · 2 years ago

      Please clear your browser cache and take care of yourself in whatever you need to. I am so sorry you had to see it.

      • @[email protected] · English · 11 points · 2 years ago

        Thanks. Can’t unsee it. It was in my app (Memmy), but I should probably clear that cache.

        • Natanael · 4 points · 2 years ago

          On Android you can long-press an app icon (from the launcher or app switcher) to get to the app’s properties, then look for Storage and wipe the cache.

      • @[email protected] · 10 points · 2 years ago

        And it’s easy. Society spends so much time and effort making life easier via improvements like simple image uploading and sharing, so of course some piece of shit will use it for this. Just a few clicks and they’ve created headaches for thousands of people. It requires no ability, so the barrier to entry is as low as being the kind of trash that likes that stuff.

  • Tash · 61 points · 2 years ago

    I would love to have the EFF chime in, but there are some protections for you as a host under the Online Copyright Infringement Liability Limitation Act (OCILLA) - or safe harbor provision in the USA.

    As to how that has been tested legally on federated content, I don’t know. Perhaps another elder of the internet can tell me how Usenet servers handle it.

    • gabe [he/him] · 29 points · 2 years ago

      You are right, there are safe harbor protections here. It’s a legal mess that must be navigated carefully. We will see how things progress.

      • @[email protected] · English · 18 points · 2 years ago

        While correct, you still may end up having to deal with the law about it. The whole “you can’t beat the ride” thing. Could be a ton of hassle and legal fees.

        • Tash · 13 points · 2 years ago

          What are you implying here? That @gabe should never have bothered with running a server? What about the server you are connected to right now? Should they shut down because of what may travel across it?

          No.

          They’re protected under the same rules as somebody running a WiFi hotspot at a coffee shop. As long as they are doing everything within reason to be a good steward of their local network (which is what Gabe is doing) then they are protected.

          • wagesj45 · 20 points · 2 years ago

            Doesn’t seem like he was implying anything. Just stating the fact that part of the burden of citizenship is sometimes having to interact with law enforcement, maybe even go to trial, even if you’ve done nothing wrong.

          • Natanael · 1 point · 2 years ago

            FYI, not all jurisdictions treat website hosting (storage and distribution) as equivalent to hotspot/internet service (a dumb relay).

          • @[email protected] · English · 9 points · 2 years ago

            I’m not suggesting anyone should or shouldn’t do anything, and I’m grateful to the people who do. Just saying it’s a potential downside that people should seriously consider before hosting any publicly accessible systems.

            They’re protected under the same rules as somebody running a WiFi hotspot at a coffee shop. As long as they are doing everything within reason to be a good steward of their local network (which is what Gabe is doing) then they are protected

            Hopefully, yeah. But again, there’s still the potential of the coffee shop having all their equipment seized and having to deal with a law enforcement investigation, and maybe even the courts. Even if the risk of actual jail time and monetary penalties is low, it’s something people should consider before doing it.

            This is one of the reasons I’m not running a public access network or Tor exit node at home, even though I think those are worthwhile things to do.

  • Andrew · 27 points · 2 years ago

    I thought it said/should say SCAM. Boy was I wrong…

  • Bipta · 122 points · 2 years ago

    You can look into Cloudflare’s CSAM setting, but I’m not exactly sure what it does.

    I don’t understand how a web host is legally responsible for what their users post as long as there’s active moderation removing it in a timely manner.

    • gabe [he/him] · 86 points · 2 years ago

      You are correct, there are safe harbor provisions on the matter. As an admin, there is a legal responsibility to report the content and store it securely once it is reported to you.

      • @[email protected] · English · 25 points · 2 years ago

        It’s like it’s not enough that you deal with all the technical shit: updating to new versions, checking shit out from GitHub, running builds, paying for the goddamn thing. Then you are also responsible for babysitting content? Fuck that. Unless you have a good group of mods/admins, it is really difficult to do.

        • @[email protected] · 17 points · 2 years ago

          That’s why you either sell your users to the advertisers or charge a monthly subscription. Free internet doesn’t work.

            • @[email protected] · English · 7 points · 2 years ago

              I can do all of the above, except for police content.

              And Reddit of course had unpaid mods to do that.

              So like I say, it can be done, you just need the right team of mods/admins for your own server.

              • @[email protected] · 3 points · 2 years ago

                Cool, you and your 5 buddies have a great time. Some of us would like to see a viable alternative to Reddit that respects privacy and doesn’t crash every other day.

                The Fediverse is going to be known as a kiddie porn haven, given the level of professionalism and maturity of the major servers.

                • @[email protected] · English · 2 points · 2 years ago

                  I have yet to see a single problem with Lemmy over months of daily use. An instance may have crashed in that time but I didn’t notice not seeing certain instances when scrolling, and I don’t seek out particular communities. Helps that I’m hosted on a less popular instance, and the lemmy.ca admins seem to run a tight ship.

                  I block a couple of communities a day, but that seems to be expected. I also haven’t seen any kiddie porn.

                  Less discussion than Reddit, and less specific communities, but that’s been easy to forgive because well, fuck Reddit.

                  If an alternative pops up at some point, I’ll be sure to give it a try. Lemmy is doing just fine for me.

    • Natanael · 11 points · 2 years ago

      FYI, in the USA, CDA Section 230 only preempts state law, not federal law. If something that is federally illegal lands on your server, you need to deal with it ASAP.

  • Melody Fwygon · English · 19 points · edited · 2 years ago

    My guess is that someone noticed that Lemmy doesn’t yet have moderation tools as robust as Mastodon’s and decided they’d federate “NoNoNo”¹ images all over the place just to be a troll.

    ¹ CSAM/CP: very illegal and naughty images of kids

  • @[email protected] · 1 point · edited · 2 years ago

    So why not disable images, including thumbnails? Wouldn’t that solve it? Imgur was created because reddit didn’t host images.

  • @[email protected] · 2 points · 2 years ago

    This is the biggest design flaw of Lemmy.

    Instances should host separate content, and aggregation across instances should be up to the client.

    Instead we got the worst of all worlds. It means that Lemmy can never truly scale performance-wise or survive legally.

    Hopefully they solve it in some way, but I don’t see how unless they do the above and totally remove cross-instance caching.

    • WtfEvenIsExistence1️OP · English · 3 points · edited · 2 years ago

      aggregation of separate instances should be up the client.

      Yeah, you’ll have a problem with overloaded requests.

      Example:

      A post is located on Instance A, which has 100 users and can handle up to 150 connections at a time.

      Instance B has 100 users.

      Instance C has 100 users.

      Assume all these users are in the same timezone and have the same one hour of free time after coming home from work.

      All users on Instances B and C want to see the post on Instance A.

      200 requests get sent to Instance A.

      Instance A also receives 100 requests from its own users.

      Instance A receives a total of 300 requests at around the same time. The instance is overloaded.

      All of Instances A, B, and C would need capacity for at least 300 connections for this to work.

      Imagine there being 10 instances. Every one of them would need capacity for at least 1000 connections.

      Instances cache content from other instances to reduce redundant requests. This cuts the remote requests down to one per instance: only one copy needs to be sent to each other instance, instead of one to each individual user of those instances.

      All 100 users at an instance can then use that one cached copy on their own instance.

      So now in this scenario:

      Instance A has a post everyone wants to see; Instance A gets 100 requests plus 1 from each other instance. Even with 10 total instances (including itself), that’s only 109 requests, a far more manageable number than 1000.
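      The arithmetic above can be sketched in a few lines. This is only an illustration of the load math, not Lemmy’s actual federation protocol; the function name and numbers are hypothetical, taken from the example.

      ```python
      def requests_to_origin(num_instances: int, users_per_instance: int,
                             cached: bool) -> int:
          """Requests the origin instance must serve for one popular post."""
          local = users_per_instance            # the origin's own users always hit it
          remote_instances = num_instances - 1
          if cached:
              # one federation fetch per remote instance; users read the cached copy
              remote = remote_instances
          else:
              # client-side aggregation: every remote user hits the origin directly
              remote = remote_instances * users_per_instance
          return local + remote

      # Three instances of 100 users, no caching: 100 local + 200 remote = 300
      print(requests_to_origin(3, 100, cached=False))   # 300
      # Ten instances, no caching: 100 + 900 = 1000
      print(requests_to_origin(10, 100, cached=False))  # 1000
      # Ten instances with caching: 100 local + 9 federation fetches = 109
      print(requests_to_origin(10, 100, cached=True))   # 109
      ```

      The caching case scales with the number of instances rather than the total number of users, which is the whole point of the trade-off described above.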