YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead

The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.

  • @[email protected]
    28 · 2 years ago

    The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.

    This seems like the only part of the suits that might have traction. All the other bits seem easy to dismiss. That’s not a statement on whether others share responsibility, only on what seems legally actionable in the US.

    • Phoenixz
      6 · 2 years ago

      The gun store owner couldn’t have known that any gun he’d sell would be used within moments, to take innocent lives.

      Hundreds, thousands of deaths due to gun violence committed right after the gun was bought would disagree with you

        • Phoenixz
          1 · 2 years ago

          No, the gun owner pulled that trigger. Take away the gun and he can’t pull that trigger anymore.

          It’s not that hard, and mental gymnastics won’t help your cause

            • @[email protected]
              2 · 2 years ago

              You don’t know how I think, this is our first interaction ever. How can you know what I think? Why are you not able to tell me what guns are for? They are for killing. Either hunting animals or shooting people.

              Are there any other uses for guns besides that? What, target practice for funsies? Where you… shoot a silhouette of a human? (I’m sure some places just have a target instead but what else would you be practicing shooting for?)

    • @[email protected]
      3 · 2 years ago

      If you don’t believe online words can sway people’s beliefs/opinions or drive them to action then why are you leaving this comment here?

      • @[email protected]
        1 · 2 years ago

        I didn’t say that. You’re putting words into my mouth. It still took a human to take up arms and use a tool. Youtube alone didn’t do this. Reddit alone didn’t do this. Guns alone didn’t do this. Training and a license would not have prevented this.

        • @[email protected]
          1 · 2 years ago

          Now you’re putting words into everyone else’s mouth. Who said any of those things are solely responsible for this tragedy? Why are you arguing as if the shooter is walking free while prosecutors go after YouTube and Reddit?

          • @[email protected]
            1 · 2 years ago

            Then why are they suing Youtube and Reddit if they don’t think those platforms are responsible? I’m saying that you people want to blame every fucking other thing in the world, EXCEPT for the person doing the fucking crime. How am I arguing that the shooter is free? I’m literally saying, this kid would have likely done this with or without those things and the stupid idea that a license would prevent it is asinine.

            • @[email protected]
              2 · 2 years ago

              How am I arguing that the shooter is free?

              This right here:

              I’m saying that you people want to blame every fucking other thing in the world, EXCEPT for the person doing the fucking crime

              The shooter is facing his day in court as will the companies who helped drive him to commit this tragedy.

              • @[email protected]
                1 · 2 years ago

                lol, ok. sure buddy. And yet we get constant cries for the removal of guns because without guns this wouldn’t happen, right?

                • @[email protected]
                  2 · 2 years ago

                  Not sure what this has to do with the rest of the discussion, but no I don’t think mass shootings would be very common if guns didn’t exist.

            • @[email protected]
              1 · 2 years ago

              I don’t know anything about it, so I can’t say for sure. If I had to guess, the guy probably couldn’t have gotten a gun.

              • @[email protected]
                1 · 2 years ago

                You’re arguing that getting a license would somehow prevent this shooting? The kid bought the guns legally as it is. They were not illegally obtained. So getting a license is just one more hoop to jump through. It wouldn’t have stopped anything, IMO.

                • @[email protected]
                  1 · 2 years ago

                  The thing about hoops is that they do prevent a lot of things.

                  Not all of them, but a lot.

                  “Ah, that’s too much to bother” is surprisingly a good deterrent.

  • AutoTL;DR
    16 · 2 years ago

    This is the best summary I could come up with:


    YouTube, Reddit and a body armor manufacturer were among the businesses that helped enable the gunman who killed 10 Black people in a racist attack at a Buffalo, New York, supermarket, according to a pair of lawsuits announced Wednesday.

    The complementary lawsuits filed by Everytown Law in state court in Buffalo claim that the massacre at Tops supermarket in May 2022 was made possible by a host of companies and individuals, from tech giants to a local gun shop to the gunman’s parents.

    The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.

    YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.

    “We aim to change the corporate and individual calculus so that every company and every parent recognizes they have a role to play in preventing future gun violence,” said Eric Tirschwell, executive director of Everytown Law.

    Last month, victims’ relatives filed a lawsuit claiming tech and social media giants such as Facebook, Amazon and Google bear responsibility for radicalizing Gendron.


    I’m a bot and I’m open source!

  • @[email protected]
    26 · edit-2 · 2 years ago

    Ahh one of those “We’re mad and we don’t have anyone to be angry with.” style lawsuits. Pretty much the Hail Mary from a lawyer who is getting their name in the paper but knows it won’t go anywhere.

    The “easy to remove gun lock” claim has been tried multiple times and usually fails. A gun lock doesn’t seem related to assault weapons or large-capacity magazines, but who knows what they mean. Even when a gun is “easily modifiable,” it’s usually not treated as illegal, because someone has to actually make those modifications. The same will probably be the case for the kevlar (which was legal at the time of the shooting).

    Youtube contributing to radicalization is a laugh; it’s an attempt to get their name in the papers and will be dismissed easily. They’d have a better chance naming the channels that radicalized him, but First Amendment rights would be near absolute here. Besides, “radicalization” isn’t the same as a conspiracy or orders. It’s the difference between someone riling up a crowd until they’re in a fervor that ends in a riot, and someone specifically telling people how to riot and who to target. (Even if both can be tried as crimes, one is a conspiracy and one is not, and even then “radicalization” would be neither.) Even “I wish someone would go shoot up …” would be hyperbole, and thrown out as well. It’s pretty hard to break First Amendment protections in America. (And that’s a good thing; if you think it’s not, imagine the other party being in power and wanting to squash your speech… yeah, let’s keep that amendment in place.)

    The same will be the case against Facebook for all the same reasons.

    If you think Google should be responsible, then you think the park someone is radicalized in should be responsible for what’s said in it, or that an email provider is responsible for every single piece of mail sent through it, even though it might not have access to see that mail. It’s a silly idea, even assuming they could do that. Maybe they’re hoping to scare Google into changing its algorithm, but I doubt that will happen either.

    The case against the parents is another one that people try, and again: unless there’s more than they’re saying, you still can’t sue someone for being a bad parent. Hell, there’s a better case against the parents of Ethan Crumbley, and even that case is still pretty shaky, and it involved the parents actively ignoring every warning sign and buying the kid the gun. Here there’s nothing that seems pinnable on the parents.

    You know it sucks, and I know there are a lot of hurt people, but lawsuits like this ultimately fail. It’s like rolling the dice: history pretty much shows they’re hoping for a one-in-a-million chance of getting lucky, and they won’t, because it’s one in a million. And even if they did win, they’d have to hope it isn’t overturned.

  • adroit balloon
    10 · 2 years ago

    interesting… whether the sites will be found liable…. it’s pretty unlikely, but it sure does shine a spotlight on how each are magnets for alt-right crazies. I wonder if that will have any effect on their moderation?

    I doubt it.

    • @[email protected]
      14 · 2 years ago

      They’re also “magnets” for progressive, liberal, conservative and all other crazies and normal people. That’s mostly because everyone uses them. It’s the most popular video sharing site and (one of?) the most popular social media site.

      • adroit balloon
        23 · edit-2 · 2 years ago

        yeah, but progressives and liberals and all other “crazies and normal people” aren’t the ones committing mass shootings all the time.

        • @[email protected]
          4 · edit-2 · 2 years ago

          Right, but since YouTube and Facebook are two of the most popular sites in the world, they aren’t really just magnets for alt-right crazies, since they appeal to almost everybody.

          • adroit balloon
            14 · edit-2 · 2 years ago

            right, but “everybody” aren’t the ones committing mass shootings all the time. that’s an alt-right crazies problem.

              • adroit balloon
                11 · edit-2 · 2 years ago

                Ok so isn’t the issue at hand whether the sites are to blame?

                let’s break this down so I can answer you in what I think is an honest way:

                1. Are the sites legally responsible for the content they host, generally speaking and/or in this context of radicalization and such subsequent results as these?

                and

                2. Do these sites bear any social/moral responsibility to moderate their more extreme content in good faith to try to prevent this sort of result?

                and

                3. Is there an overlap of 1 and 2?

                1 - this is for a court to decide. I’m not familiar enough with the very specifics of case law or with the suits being brought to know exactly what is being alleged, etc. I can’t opine on this other than to say that, from what I do know, it’s unlikely that a court would hold these sites legally responsible.

                2 - I fully believe that, yes, sites like these, massive, general-use public sites have a social and moral responsibility to keep their platforms safe. How and what that means is a matter for much debate, and I’m sure people here will do just that.

                3 - is there overlap? again, legally, I’m not sure, but there might be, and in the near future, there might be much more. also, should there be more? another subject for debate.

            • @[email protected]
              4 · 2 years ago

              I didn’t say they were. Facebook and YouTube didn’t commit the shootings, and there isn’t anything particularly special about them that would disproportionately attract the alt-right crazies. They’re not hate sites.

  • @[email protected]
    35 · 2 years ago

    They’re just throwing shit at the wall to see what sticks hoping to get some money. Suing google for delivering search results? It shows how ridiculous blaming tools is. The only person liable here is the shooter.

    • @[email protected]
      35 · edit-2 · 2 years ago

      Well, maybe. I want to be up-front that I haven’t read the actual lawsuit, but it seems from the article that the claim is that youtube and reddit both have an algorithm that helped radicalize him:

      YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.

      I’d say that case is worth pursuing. It’s long been known that social media companies tune their algorithms to increase engagement, and that pissed off people are more likely to engage. This results in algorithms that output content that makes people angry, by design, and that’s a choice these companies make, not “delivering search results”.
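      To make that mechanism concrete, here’s a toy sketch. This is purely hypothetical (the function names, signals, and weights are all made up for illustration, not any platform’s actual code): a feed ranker that sorts purely by predicted engagement. If outrage-driven posts earn more clicks, comments, and shares, they rise to the top as a side effect, with no term in the objective for harm or accuracy.

```python
# Hypothetical sketch of an engagement-optimized feed ranker.
# All names and weights are invented for illustration; real systems
# use learned models, but the objective shape is the point here.

def predicted_engagement(post):
    # Assume upstream models estimate these per-post probabilities.
    # Comments and shares are weighted higher than clicks because
    # they generate more follow-on activity.
    return post["p_click"] + 2 * post["p_comment"] + 3 * post["p_share"]

def rank_feed(posts):
    # Pure engagement sort: nothing here penalizes inflammatory content.
    return sorted(posts, key=predicted_engagement, reverse=True)

posts = [
    {"id": "calm_news", "p_click": 0.30, "p_comment": 0.05, "p_share": 0.02},
    {"id": "outrage_bait", "p_click": 0.45, "p_comment": 0.30, "p_share": 0.20},
]
print([p["id"] for p in rank_feed(posts)])  # ['outrage_bait', 'calm_news']
```

      The design choice being criticized in the lawsuits is exactly this objective: optimizing only for engagement, which surfaces anger-inducing content whenever anger engages.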

    • @[email protected]
      20 · 2 years ago

      The only person liable here is the shooter.

      On the very specific point of liability, while the shooter is the specific person that pulled the trigger, is there no liability for those that radicalised the person into turning into a shooter? If I was selling foodstuffs that poisoned people I’d be held to account by various regulatory bodies, yet pushing out material to poison people’s minds goes for the most part unpunished. If a preacher at a local religious centre was advocating terrorism, they’d face charges.

      The UK government has a whole ream of context about this: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/97976/prevent-strategy-review.pdf

      Google’s “common carrier” type of defence takes you only so far, as it’s not a purely neutral party: it “recommends” rather than merely “delivers results”, as @joe points out. That recommendation should come with some editorial responsibility.

      • @[email protected]
        5 · 2 years ago

        This is more akin to if you sold a fatty food in a supermarket and someone died from being overweight.

        Radicalizing someone to do this isn’t a crime. Freedom of speech isn’t absolute but unless someone gives them actual orders it would still be protected.

        Don’t apply UK’s lack of freedom of speech in American courts.

        • @[email protected]
          6 · 2 years ago

          Don’t apply UK’s lack of freedom of speech in American courts.

          🙄

          It is a felony under federal law to intentionally “solicit, command, induce, or otherwise endeavor to persuade” another person to engage in a crime of violence against a person or property. 18 U.S.C. § 373. https://www.law.georgetown.edu/icap/wp-content/uploads/sites/32/2020/12/Fact-Sheet-on-Threats-Related-to-the-Election.pdf

          Specific text: https://www.law.cornell.edu/uscode/text/18/373

          • @[email protected]
            2 · edit-2 · 2 years ago

            Oh, pretending you were always talking about the US when BOTH of your previous links are from the UK? Come on bro…

            And you’re citing a law without considering how it’s been applied for the last couple of centuries, or even years. In very broad terms, you can’t just claim they said something inflammatory and that person did something. For the most part, they need to be rather specific for that law to apply.

            “Someone should do something about that mosque” isn’t the same as saying “Someone should blow up that specific mosque”. And almost every time this comes up the radicalization knows how to avoid going over the line. But if I posted a message that said “someone should blow up that mosque” It would be myself that would get in trouble, not lemmy, or Youtube or where ever I posted it.

            The problem is “solicit, command, induce, or otherwise endeavor to persuade.” That’s usually far more specific than you seem to think. It’s part of why organized crime was able to survive so long, until RICO cases were made; those cases basically bypass this by saying there’s a criminal “enterprise.”

            The other problem is complaining about the “algorithm” without seeing that it would likely be a defense: it’s designed to promote retention, not radicalization. And that assumes this even gets to court, which in this case it almost certainly won’t. The fact that they’re not going after a specific person probably means they’re targeting a vague “radicalization,” and hey, you have a good point in your first link: the radicalization would be illegal under UK law. But if he did it in the US, he likely would not be in jail.

            But then again we don’t jail people for teaching dogs to do the nazi salute, so yeah, strange. We have different laws here that I still don’t think you understand.

            • @[email protected]
              4 · 2 years ago

              I know perfectly well the laws of your country, and that the links I originally posted apply to the UK. My comments were about principles, rather than the specifics of US law, which again could apply to the US.

              Google is quite wilfully recommending certain things that increase engagement; they’re metric’ed up to the eyeballs. Facebook has internal documents that clearly state they know they’re actively promoting harmful content.

              But then again we don’t jail people for teaching dogs to do the nazi salute, so yeah, strange.

              He was not jailed, he was fined, and it was for saying things “antisemitic and racist in nature”. The link has some of the things he said that are clearly not so innocuous as you seem to portray, given the rise of the right wing. The whole “it’s a joke” defence is also pretty well documented as a modern phenomenon of the right wing.

              You are misinformed and if you have any sympathies for that guy, you have the wrong priorities at best, or at worst are resorting to the usual alt right talking points.

              As a matter of principle, you’re right on one account, which is that I do not place the ultimate value on freedom of speech. The fact that American companies have a strangle hold over the public sphere and the dynamics of speech is problematic.

              • @[email protected]
                2 · edit-2 · 2 years ago

                My comments were about principles,

                So it absolutely has no value in this discussion; thanks for clarifying.

                The link has some of the things he said that are clearly not so innocuous as you seem to portray given the rise of the right wing.

                I didn’t click this link, because I don’t really care. My father was Jewish, and he could say all Jewish people should be killed and I still would say he doesn’t deserve to be put in jail. Sorry, your outrage doesn’t override the First Amendment. It’s not an “it’s a joke” defense… it’s “there’s freedom of speech.” Hard stop. Are there limitations to it? Sure, but I’m pretty sure he’s not hitting those bars.

                You are misinformed

                No, you’re talking about “principles,” which means you’re in the wrong topic and the wrong discussion. And you’re not misinformed, but willfully ignoring the reality of the situation. Maybe you’re angry that you’re not right and you’re trying to defend your position, but here’s the thing: your position doesn’t matter, the law matters. And no one is keeping score, so it’s OK, you’re wrong here; just stop making up shit.

                at worst are resorting to the usual alt right talking points.

                I always love this point. “If you don’t agree with me, you’re the enemy.” I guess the ACLU is the Alt-Right, as is any lawyer who defends someone charged with saying something that hurt someone’s feelings.

                As for “priorities”: if you think freedom of speech isn’t important, let’s think about that. It’s great right now; Nazis can’t say shit, you can say anything you want to them. But what’s that, a future where someone you don’t like is in power, and suddenly you can’t say anything and some party (potentially Nazis) can… Oh shit, well maybe freedom of speech IS actually important.

                As a matter of principle,

                I’ll repeat this again, “principles” don’t matter, laws do.

                which is that I do not place the ultimate value on freedom of speech

                That’s fine, but we’re all talking about an American case, so let’s focus on American laws, and not “what dublet feels is right”.

                This is the last time I’m responding to you, because you’ve made it clear you’re talking about the world according to you. I live in a real place, with actual laws, where this case is taking place: the United States of America. It doesn’t matter where you live or what laws apply to you; we’re talking about a specific place and specific laws. When you want to talk about those laws… well, find someone else, because you’ve already wasted enough of my time. Until you focus on how the world actually works, really no one should waste their time discussing your version of the law, because it has no basis in reality.

                • @[email protected]
                  3 · 2 years ago

                  My father was Jewish, and he could say all Jewish people should be killed and I still would say he doesn’t deserve to be put in jail

                  Roseanne Barr is Jewish and recently denied the Holocaust but also said that it should have happened.

                  Sure we’re not gonna put her in jail but she’s a guaranteed laughing stock and everyone knows it.

                  She straight up wrecked her career with that kind of thinking.

                  You don’t have to go to jail for everyone to hate you for what you are. Have fun not being in jail lol

        • @[email protected]
          1 · 2 years ago

          This is more akin to if you sold a fatty food in a supermarket and someone died from being overweight

          Do you not remember those two girls who tried to sue McDonald’s for making them fat?

          It prompted a movie and a book…

          • @[email protected]
            3 · edit-2 · 2 years ago

            And how did that case end?

            Hint: not well. You can try to sue anyone for anything; there’s just no guarantee it’ll work, and it didn’t there.

            There are cases that do work, such as the ones about trans fats, but those are about specifically misleading someone, not supplying something unhealthy. Also, that was settled, not fully through the courts.

            • @[email protected]
              2 · 2 years ago

              so?

              the case in OP is still going on, so we don’t know how it will end yet. I was just pointing out something that already happened, because the metaphor used matched that case. Like, it was funny to mention something like that when it already happened and we know how it played out.

              We don’t know how the case in OP is gonna play out. You can’t predict the future.

        • @[email protected]
          4 · 2 years ago

          This is more akin to if you sold a fatty food in a supermarket and someone died from being overweight.

          No. It’s actually more akin to someone designing a supermarket that made it near impossible for a fat person to find healthy food and heavily discounted fatty foods and someone died from being overweight.

          • @[email protected]
            1 · 2 years ago

            And that still would be legal.

            McDonald’s has existed for decades with that model. The only lawsuits against them are usually settled, and they’re about things where the company knowingly lied, like about trans fats. You can’t blame McDonald’s for your unhealthy eating, and you can’t blame one supermarket because it doesn’t sell what you think is healthy. So sure, your version is perfectly fine too… and yet it is still legal.

            Ever been to a candy store? A chocolate shop? Even Cheesecake Factory is really unhealthy in general and is still a major chain. At some point it comes down to personal responsibility.

  • @[email protected]
    28 · edit-2 · 2 years ago

    The article doesn’t really expand on the Reddit point: apart from the weapon trading forum, it’s about the shooter being a participant in PoliticalCompassMemes which is a right wing subreddit. After the shooting the Reddit admins made a weak threat towards the mods of PCM, prompting the mods to sticky a “stop being so racist or we’ll get deleted” post with loads of examples of the type of racist dog whistles the users needed to stop using in the post itself.

    I don’t imagine they’ll have much success against Reddit in this lawsuit, but Reddit is aware of PCM and its role and it continues to thrive to this day.

    • @[email protected]
      3 · 2 years ago

      In the USA it’s not a crime to be racist, promote a religion teaching that God wants you to be racist, say most racist things in public, or even join the American Nazi Party. The line is set at threatening, inciting, or provoking violence, and judges don’t accept online arguments that saying racist garbage is inherently threatening.

    • DigitalTraveler42
      11 · 2 years ago

      PCM isn’t just a Right wing subreddit, it’s a Nazi recruitment sub under the guise of “political discussion”.

    • @[email protected]
      8 · 2 years ago

      I just took a casual look at that sub and noped the fuck out. Sad to see how active a toxic community like that is, though not really surprising.

    • @[email protected]
      1 · 2 years ago

      He wasn’t a participant. I was a mod there before I immolated my Reddit account, and the day it happened I trudged through his full 196-page manifesto. It mentions PCM exactly zero times. What does he mention in it? /pol/ and /k/ specifically, with /pol/ taking around 40% of the entire manifesto. He made a single comment on /r/pcm. That comment? “Based.” We have/had nearly 600k users, 150k active weekly. One person making one comment does not define the community. He was active on other parts of Reddit as well, much more than ours.

    • @[email protected]
      4 · 2 years ago

      Who would be the right one to sue? Reddit is hosting it, but they are using admins to keep discussion civil and legal; the admins of PCM are most likely not employed by Reddit, but are they responsible for users egging each other on? At what point is a mod responsible for users using “free speech” to instigate a crime? They should have picked a few posts and users and held them accountable instead of going for the platform. People will keep radicalizing themselves in social media bubbles, in particular when those bubbles are not visible to the public. Muting discussion on a platform will just make them go elsewhere or create their own. The better approach would be to expose them to different views and critique of what they are saying.

      • lemmyvore
        10 · 2 years ago

        There’s admins and there’s moderators (mods). Please clarify which you mean.

        Admins are Reddit employees and are supposed to enforce site-wide rules outlined in their policy and terms of use.

        Moderators are unpaid volunteers whose identity is typically unknown to Reddit who are in charge of running a sub. Moderators can make up additional rules and enforce them.

  • 【J】【u】【s】【t】【Z】
    97 · 2 years ago

    Fantastic. I’ve been waiting to see these cases.

    Start with a normal person, get them all jacked up on far right propaganda, then they go kill someone. If the website knows people are being radicalized into violent ideologies and does nothing to stop it, that’s a viable claim for wrongful death. It’s about foreseeability and causation, not about who did the shooting. Really a lot of people coming in on this thread who obviously have no legal experience.

    • Phoenixz
      6 · 2 years ago

      Really a lot of people coming in on this thread who obviously have no legal experience.

      Like you

        • @[email protected]
          3 · 2 years ago

          Does remindmebot exist on Lemmy? I’d be very interested in a friendly wager.

          Loser has to post a pic in a silly shirt!

          • 【J】【u】【s】【t】【Z】
            1 · 2 years ago

            I don’t know but I’m 3 for 3 on these.

            I bet that the Supreme Court would uphold the ATF interpretation on the bump stock ban, that appeals courts would find a violation of 1A where Trump and other political figures blocked constituents on social media, and that Remington was going to be found liable in the Sandy Hook lawsuit on a theory not wholly dissimilar from the one we’re talking about here. I’m pretty good at novel theories of liability.

    • @[email protected]
      3 · 2 years ago

      The catch is whether the site knows that specific individual is being radicalized. If admins aren’t punishing the account regularly I wonder how difficult it will be to prove reddit/YT specifically pushed this guy.

    • GreenBottles
      link
      fedilink
      English
      49
      2 years ago

      I just don’t understand how hosting a platform to allow people to talk would make you liable since you’re not the one responsible for the speech itself.

      • @[email protected]
        link
        fedilink
        English
        2
        2 years ago

        We should get the thought police in on this also, stop it before it has a chance to spread. For real though, people need to take accountability for their own actions and stop trying to deflect it onto others.

      • @[email protected]
        link
        fedilink
        English
        16
        2 years ago

        Because you are responsible for hiring psychologists to tailor a platform to boost negative engagement, and now there will be a court case to determine culpability.

        • @[email protected]
          link
          fedilink
          English
          6
          2 years ago

          Reddit is going to have to make the argument that it just boosts “what people like” and it just so happens people like negative engagement.

          And I mean it’s been known for decades that people like bad news more than good news when it comes to attention and engagement.

          • @[email protected]
            link
            fedilink
            English
            6
            2 years ago

            They probably will make that argument, but that doesn’t instantly absolve them of legal culpability.

      • @[email protected]
        link
        fedilink
        English
        50
        2 years ago

        Is that really all they do, though? That’s what they’ve convinced us they do, but everyone on these platforms knows how crucial it is to tweak your content to please the algorithm. They also do everything they can to become monopolies, without which it wouldn’t even be possible to start on DIY videos and end up on white supremacy or whatever.

        I wrote a longer version of this argument here, if you’re curious.

        • @[email protected]
          link
          fedilink
          English
          11
          2 years ago

          This is a good read, I highly suggest people click the link. Although it is short enough that I think you could have just posted it into your comment.

          • @[email protected]
            link
            fedilink
            English
            8
            2 years ago

            Yes, but then I couldn’t harvest all your sweet data.

            Kidding! It’s a static site on my personal server that doesn’t load anything but the content itself. It’s mostly just a PITA to reformat it all for mobile.

      • Pyr
        link
        fedilink
        English
        15
        2 years ago

        I agree to a point, but think that depending on how things are structured on the platform side they can have some responsibility.

        Think of Facebook. They have algorithms that make sure you see what they think you want to see. It doesn’t matter if that content is hateful and dangerous; they will push more of it onto a damaged person and stoke the fires simply because they think it will make them more advertising revenue.

        They should be screening that content and making it less likely for anyone to see it, let alone damaged people. And I guarantee you they know which of their users are damaged people just from comment and search histories.

        I’m not sure if Reddit works this way; because of the upvote and downvote systems, it may be more the users who decide the content you see. But Reddit has communities, which it can keep a closer eye on to prevent hateful and dangerous content from being shared.

      • YeetPics
        link
        fedilink
        English
        7
        2 years ago

        Tell that to the admins of lemmy.world defederating from communities because they may be held liable for what shows up on their website.

      • 【J】【u】【s】【t】【Z】
        link
        fedilink
        English
        11
        2 years ago

        They set the culture.

        Did reddit know people were being radicalized toward violence on their site and did they sufficiently act to protect foreseeable victims of such radicalization?

  • @[email protected]
    link
    fedilink
    English
    1
    2 years ago

    I can’t see how the lawsuit against the tech giants gets past Section 230, which is unfortunate, as Spez and the people who run YouTube willfully helped enable and encourage this shooter.

    • @[email protected]
      link
      fedilink
      English
      3
      2 years ago

      You argue that the product is faulty; you don’t play with 230. That’s my guess as to their strategy, as it’s the same strategy other lawyers are attempting to use.

  • Brownian Motion
    link
    fedilink
    English
    1
    edit-2
    2 years ago

    FTFY.

    "YouTube, Reddit and a body armour manufacturer were among the businesses that helped enable the gunman who killed 10 Black people in a racist attack at a Buffalo, New York, supermarket, according to a pair of lawsuits announced Wednesday."

  • @[email protected]
    link
    fedilink
    English
    37
    2 years ago

    YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.

    Yeah this is going nowhere.

  • GreenBottles
    link
    fedilink
    English
    1
    2 years ago

    I have a feeling no one here has ever run a website in their life.