Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to object to how we are doing things, your post will be removed.

==

A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.

I approached the admins of lemmynsfw, and they assured me that the models featured in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they're not 18 was fine with them. The fact that both I and a member of this instance assumed it was CSAM was fine with them. I was, in fact, told that I was body shaming.

I’m sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.

  • Leigh 🏳️‍⚧️@lemmy.blahaj.zone · 3 points · edited · 1 year ago

    > How is anyone supposed to determine whether this was a good idea or not

    Ada’s judgment is not infallible, but I’d rather trust her judgment than go personally look for something she initially (and, by her own admission, mistakenly) thought was CSAM. There are two possible outcomes: (1) I see something that looks similar to CSAM to me and I feel gross about it, or (2) I don’t see any problem with the content, but it doesn’t change anything, because she’s the admin here and is still unwilling to host copies of it on her server, where she evaluates anything that gets reported.

    In either case, I can still enjoy content from LemmyNSFW elsewhere if I so choose — just not at Blahaj Zone.

    > And this whole debate is literally declaring that legal adults don’t look right, and shouldn’t be allowed to post explicit images

    I think the two sides here are having different debates. Yes, there are legal adults who may appear underage, and they should have the same freedom any other adult has to post explicit pictures of themselves if they so choose. But a community that specifically encourages “child-like” content (as the community’s rules said at the time this decision was made) is going to gather multiple examples of this. Even if Ada fully trusts LemmyNSFW’s admins to 100% prevent any real CSAM from being federated, she’d still be exposed to reports of “potential CSAM” from there. She’s a community-building volunteer who willingly examines reported content that gets federated to Blahaj Zone, but she doesn’t want to view any more of it than is strictly necessary to protect her community. So she’s unwilling to federate with an instance that knowingly hosts such a community (even if the content is 100% legal) because it would cause more reports as time goes on. The content also upsets her on a personal level, which is fine — she’s a human being and is allowed to have feelings.

    Other admins at other instances might not have the same aversion to this specific type of legal content that Ada does, so maybe they don’t mind having it copied onto their servers. That’s cool. The Fediverse is great like that: users aren’t stuck with the decisions of any single person in charge. Ada announced her decision so that all of us Blahaj Zone users would know about it, and if any of us feel strongly enough (and clearly a number of people do), we can vote with our feet and go use one of those other instances, so we also don’t lose access to the communities we use here.

    This is my final comment on the matter. You may have the last word if you wish.

    • TopRamenBinLaden@sh.itjust.works · 5 points · 1 year ago

      If you browse all and sort by hot or popular on any of the Lemmy apps, posts from that community would pop up. It’s not some hidden community. I think a lot of people had already seen posts from there. I figured that it had to be some other community on there, as I never really saw anything that looked too suspect from the more popular posts that reached all. It’s petite pornstars.

      Nobody is a bad person for looking to see what the blahaj admin was talking about and verifying it for themselves, either. I think most people figured there was obviously no CSAM on there, considering the community is still up and running, and they probably wanted to see if their morals align with the admin here.

      You can’t just take someone’s word for truth on the internet these days.

    • Ryantific_theory@lemmy.world · 4 points · 1 year ago

      > But a community that specifically encourages “child-like” content (as the community’s rules said at the time this decision was made) is going to gather multiple examples of this.

      This is part of why the whole debate is blown out of proportion. The community was for posting images of “adorable” pornstars, a direct clone of the reddit community that’s one of the largest nsfw subreddits and has been for nearly a decade. The mod made the misstep of posting the dictionary definition of “adorable” in the sidebar, and can you guess what hyphenated word was a part of that? The idea that there’s even a “this type of content” to have an aversion to feels ridiculous after seeing the community.

      It’s not teen focused, nor attempting to simulate dubious content; it’s literally just pornstars looking cute. If the issue is gut-checking pornstars, the same thing is going to happen with the nsfw communities on this instance, barring a shift to milf-only posting instead of simply legal porn.

      At any rate, I appreciate the civil last word, even if we still disagree.