More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:
I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”
To be clear — what McKenzie is saying here is that Substack will continue to pay Nazis to write Nazi essays. Not just that they will host Nazi essays (at Substack’s cost), but they will pay for them.
They are, in effect, hiring Nazis to compose Nazi essays.
Not exactly. Substack subscribers pay subscription fees; the author keeps roughly 80% of those fees, and the rest goes to Substack, covering its cut and costs like hosting. The Nazi subscribers are paying the Nazi publishers, and money is flowing from the Nazi subscribers to Substack because of that operation (not away from Substack, as it would be if Substack were hiring Nazis).
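For what it's worth, here's a minimal sketch of that flow with illustrative numbers (the 10% platform cut and 3% processing fee are assumptions for the example, not Substack's published rates):

```python
# Hypothetical illustration of the money flow described above. The
# roughly-80% author share comes from the comment; the exact platform
# and processing percentages are assumptions for this example.
SUBSTACK_CUT = 0.10    # assumed platform share of each subscription
PROCESSING_CUT = 0.03  # rough stand-in for payment-processing fees

def split_subscription(fee: float) -> dict[str, float]:
    """Divide one subscriber payment between author, platform, and processor."""
    platform = round(fee * SUBSTACK_CUT, 2)
    processing = round(fee * PROCESSING_CUT, 2)
    author = round(fee - platform - processing, 2)
    return {"author": author, "substack": platform, "processor": processing}

# One $10/month subscriber: the author keeps most of it, and a cut flows
# *to* Substack, the opposite direction from a salary or commission paid
# *by* Substack.
print(split_subscription(10.00))
# {'author': 8.7, 'substack': 1.0, 'processor': 0.3}
```

Whatever the exact percentages, the direction is the point: every dollar starts with a subscriber and ends split between the author and Substack.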
That’s splitting hairs. Salespeople who work on commission keep a share of what they bring in for the company, but I doubt many people would claim they aren’t being paid to sell a product.
They are being paid by subscribers, not by Substack. I am not on Substack’s side here, but that detail seems quite relevant if we’re interested in painting an accurate picture of what’s going on.
If they were putting Nazi content on Substack and no individuals were subscribing to read it, they would earn $0.
Substack is profiting from those same subscribers, no doubt.
They are being paid by subscribers, not by Substack.
Again: if you sold widgets door-to-door for a 20% commission, would you say you were being paid by the people who buy the widgets? I doubt many would.
In that case I’d be selling something made by the entity giving me the commission; what people want and pay for is something made by someone other than me. In this case, the people creating the content are the same people drawing the subscribers, so it’s more accurate to say Substack takes a cut of their subscription income than to say Substack pays them.
If I stop selling widgets, the company still has the exact same widgets and can get anyone else to sell them. If a renowned Nazi writer (bleh) takes their content to another platform, Substack no longer has that content (or the author’s presence on their platform) to profit from.
what people want and pay for is something made by someone other than me.
Sort of like Substack’s servers then?
You think the platform is the widget, I think the content is the widget. I guess we’ll have to agree to disagree.
removed by mod
How is it pedantic to point out that “will pay for them” means “will get paid by them”?
There’s a perfectly good argument to be made that Substack shouldn’t host Nazis even if they’re making money off them. But that wasn’t the message; the message was, they’re hiring Nazis. It’s relevant whether they’re materially supporting the Nazis, or being materially supported by a cut of their revenue.
It wasn’t my message, but it certainly made sense to me and still does, whereas your message makes sense but in a totally different way. It’s basically “nuh-uh.”
Hm. Fair enough. The core complaint I have with banning Nazis from being able to speak has nothing to do with which way the money is flowing. And I fixed “your” to be “the”; I just hadn’t noticed you weren’t the person I was talking with before.
So they’re Nazis
Submitted for good faith discussion: Substack shouldn’t decide what we read. The reason it caught my attention is that it’s co-signed by Edward Snowden and Richard Dawkins, who evidently both have blogs there I never knew about.
I’m not sure how many of the people who decide to comment on these stories actually read up on them first, but I did, including actually reading the linked Atlantic article. I would personally feel very uncomfortable about voluntarily sharing a space with someone who unironically writes a post called “Vaccines Are Jew Witchcraftery”. However, the Atlantic article also notes:
Experts on extremist communication, such as Whitney Phillips, the University of Oregon journalism professor, caution that simply banning hate groups from a platform—even if sometimes necessary from a business standpoint—can end up redounding to the extremists’ benefit by making them seem like victims of an overweening censorship regime. “It feeds into this narrative of liberal censorship of conservatives,” Phillips told me, “even if the views in question are really extreme.”
Structurally this is where a comment would usually have a conclusion to reinforce a position, but I don’t personally know what I support doing here.
IDGAF if it feeds into the narrative. It also shuts down a recruitment pipeline. It reduces their reach. It makes the next generation less likely to continue the ideology. De-platforming is a powerful tool that should be reserved for only the most crucial fights, but the fight against Nazis is one of those fights.
The Nazis were already full-blown conspiracy theorists. EVERYTHING is spun to feed into their narrative. That ship has sailed.
A platform operator needs to AT MINIMUM demonetize the content and censure it, and is likely only being responsible if they ban it outright. If you aren’t prepared to wade into the fraught, complex world of content moderation, don’t run a content platform.
this is enablement
Gen Z needs to understand the historical lesson that the Blues Brothers taught those before them. Illinois Nazis exist, and some days they demonstrate, as is their right to freedom of speech. But that is as much an opportunity to humiliate them and openly critique the mindset as anything else. Dark little underground communities flourish behind closed doors.
Jesus Christ censorship has become such a meaningless word now.
Yep. It went from meaning government action to some company not wanting a Nazi troll.
I just want to make it clear that we don’t like Nazis either
Actions speak louder than words. Fuck Substack and fuck any platform that offers a safe haven for nazis.
“I don’t like Nazis… but you have to understand, they’re very profitable.”
“I want you to know that I don’t like nazis. But I am fine platforming them and profiting from them. Now here is some bullshit about silencing ‘ideas.’”
This is plainly irresponsible.
Ehhh, it’s one of those things where I agree with the principle, but the principle fails. It’s the so-called tolerance paradox (which isn’t actually a paradox at all, but that’s tangential).
On principle, no company should be in the business of deciding what is and isn’t acceptable “speech”. That’s simply not something we really want happening.
But then there are Nazis and other outright insane bigots. Even so, we still don’t really want companies making that call, because they’ll decide on the side of profit, period. If enough of the Nazi types get enough power and money going, every single fucking company out there that isn’t owned by a single person, or by a very small group of people who share the same ideals, is going to decide that the Nazi bullshit is the only acceptable speech.
This is something that has to come from the bottom to the top and be decided on a legal level first. We absolutely can ban Nazi-type bullshit if we want to. There’s plenty of room for it to be pointed at as the incitement to violence that it is. There need to be very specific, very limited definitions governing what is and isn’t part of that.
And the limitations have to be impossible to expand without starting all the way over with the kind of stringency it takes to amend the constitution.
That takes it out of the hands of corporations, and makes it very difficult to game. But it has to come from us, as a people first.
McKenzie needs to read that Reddit story about the bartender who kicked out a guy with the Third Reich eagle insignia on his shirt, despite him quietly minding his own business. I really don’t want Substack to “suddenly become a Nazi bar.” I’m just a reader, but if I ever start a newsletter I may reconsider my platform. I am on a basic free plan for all Substack channels I read. I’ve thought about upgrading my subscription to some, but now I will hesitate.
Tolerating Nazism and allowing it to use social tools to spread its hate is what makes it worse.
Kicking them off the platform just sends them to other echo chambers like False social where they just circle jerk each other all day unchallenged.
And then people wonder why we’re so scared of Facebook if the fediverse is “supposed to be open”.
The answer is literally in front of you, people!
Nazism doesn’t deserve tolerance; any person who doesn’t punch it in the face is equally bad or worse.
Monetization of such content is questionable for sure, but I agree with what he says about the propagation of such extreme views. Simply being unaware of such things won’t make them go away. People should know who these people are and why, so we can deal with them better. There’s a lot we could do better but can’t, because of limited awareness and our own reluctance to deal with them.