Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to tell us that you disagree with how we are doing things, your post will be removed.
==
A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.
I approached the admins of lemmynsfw and they assured me that the models with content in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they’re not 18 was fine with them. The fact that both I and a member of this instance assumed it was CSAM was fine with them. I was in fact told that I was body shaming.
I’m sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.
Good choice and thank you.
Editing to add: I read the post addressing this on the other instance, and the message screenshots between Ada and the nsfw admins, and the nsfw admins’ ignorance sure looks willful to me. They’re defending an nsfw community that had “child-like” as one of the adjectives emphasized in its sidebar until after this defederation took place. Not acceptable.
deleted by creator
but so far Lemmy has not been shown to be a good platform for that (unless you’re the “no restrictions” type of user)
I’m curious why you feel this way?
I get the feeling it’s harder to moderate the nsfw content that is posted in real time across multiple instances and even more communities. Anyone could poison the well with heinous content and it would take a moderator of that specific instance/community to remove that content, rather than having centralized moderators for illegal/deplorable content.
I’m just imagining the liability of nsfw content. Honestly, I think it’s an excellent idea that Ada defederated; I don’t think they’d want the legal risk. So many laws can be broken just by neglect: revenge porn laws, depiction of actual SA, underage content slipping through, etc. etc.
I was laughed out of the room for pointing this out the other day, and I still get laughed out of the room pointing out the massive privacy concerns and liability everyone is setting themselves up for.
Someone is going to get sued for children’s data and GDPR.
removed by mod
I’m not on this instance, but thank you for being so swift and resolute in your actions. Happy to see all due caution is being taken. Not so happy that such a community made its way here to the fediverse. Hopefully I won’t see any of it while doomscrolling.
Tbh shoutout to Connect, it lets me block whatever. My block list is primarily the hundreds of gross, weird porn communities that have popped up on this site.
Best app I’ve used so far. Why are there so many furry porn communities anyway? They’re pretty much my whole blocklist
You don’t appreciate 46 different sub-genres of furry porn, each with a separate community, filling up your feed?
Honestly almost all the porn I see on here is straight, I’ve only seen furry a handful of times with casual scrolling.
I’ve seen the joke a few times now, I just think it’s become the thing to joke about. I’m on a yiff instance and have 3 yiff communities of my own here and my feed is still 90% human porn.
The instances that run the degenerate porn are all basically isolated and don’t get much traction. And people in the main fediverse don’t ever hear of them, because pointing one out never draws much of a reaction (except Burggit, because loli is an easy target).
Thank you. Just the spam in new was bad enough, but CSAM? Holy crap.
To be clear, it is not CSAM. It is legal porn deliberately designed to look like CSAM
In some jurisdictions that is still considered CSAM. Even if it’s animated, or whatever the excuse…
Like when Australia floated banning smaller-boobed women from porn, which is also something that everyone agreed with.
Floated? This is law
Great. Even better that they decided to actually enforce that only girls with big tits count as women and everything else is a child. That is an entirely rational and reasonable approach.
Only big tiddy goth gf allowed
-Australian government
It still feels like it’s in a grey area, just like some anime.
My “favorite” was the vampire who has the body of a little girl, but the argument was “in the story, she’s actually hundreds of years old, so she’s not a minor!” 🙄
Would you mind posting the chat logs with the lemmynsfw team, for transparency’s sake? Not trying to cause more drama, but I think the whole thing just needs to be more transparent. Sorry if this is an out of line request.
The ones the lemmynsfw admins posted are accurate.
This post isn’t for the Fediverse, it’s an announcement to the users of Blåhaj as it impacts their experiences here.
I don’t intend to get into he said/she said over it with the wider Fediverse.
Thanks, wasn’t trying to. Totally not here to pick sides or start trouble. Just wanted to end the speculation I was reading on both sides… I just wanted to make sure it was accurate. Thanks for the verification. Aggressive support to you and your community still 🫡 I’ll be off to my own areas.
Thanks :)
Never mind, see the comments below.
Child sexual abuse materials
Ugh. They need to be more than defederated, even if it isn’t actually CSAM. Sick people.
Thank you. I was just about to search the acronym CSAM to find out what it meant. Eww.
FYI, it’s not directly related, but there was a story in the Washington Post today: “Twitter rival Mastodon rife with child-abuse material, study finds” … of course 90% of it came from a large Japanese site (presumably pawoo), but they also mentioned some originating on big mainstream sites; and some sites don’t block pawoo, so it’s potentially in their federated timelines. Here’s the underlying report: “Child Safety on Federated Social Media”.
I am very disheartened by the number of people replying here who read “a community skirting the line trying to look like CSAM” and felt the need to go purposefully seek out that community to look through its images.
Probably because the community in question isn’t trying to “skirt the line” and just posts popular pornstars that range from 18 to the mid twenties. I thought it was a kink community until someone finally linked the lemmynsfw post and it’s actually just a community for cute pornstars.
Calling it CSAM-adjacent just means that nobody’s comfortable actually looking at it to figure out what’s going on, and it’s hugely exaggerated.
It’s not about whether the community actually skirts the line or not. It’s about how many people thought “gee, someone thinks these pictures are CSAM-adjacent, I need to go see for myself”. That’s disheartening.
As an aside, I didn’t realize I was annoying you in two different comment chains until just now. Sorry about that lol.
To your point though, that’s why calling it CSAM-adjacent is an issue. Either you trust a stranger’s judgement of whether these legal pornstars’ bodies are morally wrong, or you feel morally wrong for checking to see if you agree or disagree with their assessment. Given the language used here, it’s unsurprising that the thread over on Lemmynsfw is completely different in tone where the community name wasn’t hidden and everyone could just see for themselves.
Oh please, no one here is calling anyone’s body “morally wrong”.
I don’t need to “see if [I] agree or disagree with [the admin’s] assessment.” It wouldn’t make any difference whether I do or not. And it doesn’t matter what the community’s name is. By going to look, I’d be knowingly putting myself in a position to potentially see something that looks like CSAM. Why would I want to do that??
But a lot of people made the choice to do that, presumably for the sake of arguing with an admin on an instance many of them don’t even use. That is disheartening.
By going to look, I’d be knowingly putting myself in a position to potentially see something that looks like CSAM. Why would I want to do that??
I mean, that’s literally my point. The way it’s presented makes it seem like this ultra-sketchy community that despite being entirely legal, is supposedly morally wrong. How is anyone supposed to determine whether this was a good idea or not, if the very idea of checking is portrayed as morally repugnant?
And this whole debate is literally declaring that legal adults don’t look right, and shouldn’t be allowed to post explicit images of themselves or other professional sex workers. It’s incredibly subjective.
How is anyone supposed to determine whether this was a good idea or not
Ada’s judgment is not infallible, but I’d rather trust her judgment than go personally look for something she initially (and admittedly mistakenly) thought was CSAM. There are two possible outcomes: (1) I see something that looks similar to CSAM to me and I feel gross about it, or (2) I don’t see any problem with the content, but it doesn’t change anything because she’s the admin here and is still unwilling to host copies of it on her server, where she evaluates anything that gets reported.
In either case, I can still enjoy content from LemmyNSFW elsewhere if I so choose — just not at Blahaj Zone.
And this whole debate is literally declaring that legal adults don’t look right, and shouldn’t be allowed to post explicit images
I think the two sides here are having different debates. Yes, there are legal adults who may appear underage, and they should have the same freedom any other adult has to post explicit pictures of themselves if they so choose. But a community that specifically encourages “child-like” content (as the community’s rules said at the time this decision was made) is going to gather multiple examples of this. Even if Ada fully trusts LemmyNSFW’s admins to 100% prevent any real CSAM from being federated, she’d still be exposed to reports of “potential CSAM” from there. She’s a community-building volunteer who willingly examines reported content that gets federated to Blahaj Zone, but she doesn’t want to view any more of it than is strictly necessary to protect her community. So she’s unwilling to federate with an instance that knowingly hosts such a community (even if the content is 100% legal) because it would cause more reports as time goes on. The content also upsets her on a personal level, which is fine — she’s a human being and is allowed to have feelings.
Other admins at other instances might not have the same aversion to this specific type of legal content that Ada does, so maybe they don’t mind having it copied onto their servers. That’s cool. The Fediverse is great like that, users aren’t stuck with the decisions of any single person in charge. Ada announced her decision so that all we Blahaj Zone users would know about it, and if any of us feel strongly enough (and clearly a number of people do), we can vote with our feet and go use one of those other instances so we also don’t lose access to the communities we use here.
This is my final comment on the matter. You may have the last word if you wish.
But a community that specifically encourages “child-like” content (as the community’s rules said at the time this decision was made) is going to gather multiple examples of this.
This is part of why the whole debate is blown out of proportion. The community was for posting images of “adorable” pornstars, a direct clone of the reddit community that’s one of the largest nsfw subreddits and has been for nearly a decade. The mod made the stumble of posting the dictionary definition of “adorable” on the sidebar, and can you guess what hyphenated word was a part of that? The idea that there’s even a “this type of content” to have an aversion to feels ridiculous after seeing the community.
It’s not teen focused, nor attempting to simulate dubious content, it’s literally just pornstars looking cute. If the issue is gut-checking pornstars, the same thing is going to happen with the nsfw communities on this instance, barring a shift to milf-only posting instead of simply legal porn.
At any rate, I appreciate the civil last word, even if we still disagree.
If you browse all and sort by hot or popular on any of the Lemmy apps, posts from that community would pop up. It’s not some hidden community. I think a lot of people had already seen posts from there. I figured that it had to be some other community on there, as I never really saw anything that looked too suspect from the more popular posts that reached all. It’s petite pornstars.
Nobody is a bad person for looking to see what the blahaj admin was talking about and verify for themselves, either. I think most people figured that there is obviously no CSAM on there considering the community is still up and running, and they probably wanted to see if their morals align with the admin here.
You can’t just take someone’s word for truth on the internet these days.
That’s the thing. I have seen posts from there pop up on all. Nothing on there implies that it is trying to appeal to pedos. It’s just petite pornstars. That’s it. I am a CSA survivor myself, and nothing on there gave me creepy vibes at all. This is a bit overblown in my opinion.
removed by mod
Oh stfu
deleted by creator
A well-worded disagreement from outside the instance shouldn’t be valued less because the person doesn’t directly use the instance. The post is open to all of Lemmy; attempting to close it makes very little sense. Regardless of whether it affects us or not, just as people have opinions on other countries, we should be allowed to call out behavior or actions that fall short of expectations, lest our own instance/country be shaped to imitate them by those who see only agreement.
This argument makes no sense when someone could just create a brand new account on this particular instance, and say the exact same thing.
Reading the explanation, I would purchase you a beverage of your choice if you were local to me.
deleted by creator
Bloody hell, this thread is a mess of people from other instances complaining. I wish Lemmy would add the ability to set a community as private to its instance, or only commentable by instance members. If you’re not from this instance, this defederation doesn’t affect you and you should step off. The admins’ job here is to protect us, the users on this instance. Not to appease you.
your meds, and I’m not talking about your transition meds, take them
Thanks Ada ❤️, good to know that those sorts of things are kept off blåhaj!
If I believe the mod of the community in question is telling the truth, it seems like the incident was just a misunderstanding. The community name is
spoiler
adorableporn
I will refer to this as “the first community” in the following text.
The mod of the community copy/pasted the dictionary definition from vocabulary.com, which contains the word “childlike”.
IMO, the community in question is not trying to skirt the line of Child Sexual Abuse Material (CSAM). In fact, there is a subreddit of the same name which has absolutely nothing to do with people that appear underage.
That said, the same mod also moderates, and posts to a different community with a concerning name. The spoiler below shows the name and the first three paragraphs of the sidebar as they appear:
spoiler
Community is now open to posting. Posts not having verification info will be removed.
FauxBait is a place for sharing images and videos of the youngest-looking, legal-aged (18+) girls. If you like fresh, young starlets, this is the place for you!
Just to be clear: We only feature legal, consenting adults in accordance with U.S. Laws. All models featured were at least 18 years old at the time of filming.
Also, I’m not sure if the timestamps can be trusted, but said mod was instated as the only active mod of the first community at the same time that Ada made this post, which would mean that the mod account could not have been the one that wrote the original sidebar of the first community. Not sure what to make of that. For the sake of balance, though, said mod does seem to be doing verifications of the age requirements. Also, the modlog for the first community shows two admin removals from at least 10 days before this debacle, both of which err on the side of caution, so at least the admins do seem to care about enforcing their rules.
The situation seems very muddy, but I personally don’t think the original incident was that big of a deal (assuming the mod is telling the truth). However, I certainly don’t blame the blahaj admins for defederating, as it’s certainly the safest option. Wouldn’t want blahaj lemmy to get taken down :| Also happy to see less pron in my feed; I’m too lazy to block the individual /c/. Personal Instance-level blocking can’t come soon enough.
I’m too lazy to block the individual /c/. Personal Instance-level blocking can’t come soon enough.
A lot of the apps for Lemmy do actually have this feature. It’s one of the reasons I used them and not the website.
but I personally don’t think the original incident was that big of a deal
The post I saw looked like an underage teenage girl. It was reported as child porn and looked like it to me before I even looked at the community.
Then when I looked at the community, I discovered it wasn’t accidental. The whole point of the community is to appeal to folks looking for people that look like underage teenagers.
That’s a pretty big deal.
The whole point of the community is to appeal to folks looking for people that look like underage teenagers.
It’s not though? Only the other community is like that. Still, defederating is probably the best choice indeed.