Lemmyshitpost community closed until further notice - Lemmy.world
Hello everyone, We unfortunately have to close the !lemmyshitpost community for
the time being. We have been fighting the CSAM (Child Sexual Abuse Material)
posts all day but there is nothing we can do because they will just post from
another instance since we changed our registration policy. We keep working on a
solution, we have a few things in the works but that won’t help us now. Thank
you for your understanding and apologies to our users, moderators and admins of
other instances who had to deal with this. Edit: @Striker@lemmy.world
[https://lemmy.world/u/Striker] the moderator of the affected community made a
post apologizing for what happened. But this could not be stopped even with 10
moderators. And if it wasn’t his community it would have been another one. And
it is clear this could happen on any instance. The only thing that could have
prevented this is better moderation tools. And while a lot of the instance
admins have been asking for this, it doesn’t seem to be on the developers’
roadmap for the time being. There are just two full-time developers on this
project and they seem to have other priorities. No offense to them but it
doesn’t inspire much faith for the future of Lemmy. But we will not give up. We
are lucky to have a very dedicated team and we can hopefully make an
announcement about what’s next very soon.
They also shut down registration
Whoever is spamming CP deserves the woodchipper
What was the burggit/vlemmy debacle?
I know that vlemmy suddenly disappeared with no warning.
Can’t speak to Burggit’s place in the saga, but it’s widely speculated the vlemmy admin found some CSAM in the data storage and shut the whole thing down so as not to be further legally liable for illegal activity on the server. I’ve seen some people saying admins don’t have to worry about that because of this section of this code of that country’s legal doctrine or whatever, but the reality a lot of us face is that law enforcement and prosecutors don’t care how the CSAM got there, or whether you knew about it, because it’s your burden to prove them wrong about that, and they just have to convince jurors who don’t know how the tech works that it’s your hardware, your hosted service, and therefore your CSAM. The consequences of mishandling or failing to document your actions in regard to CSAM are incredibly dire. You could see yourself sent to prison, and if not, upon your release be permanently ostracized in ways that could leave you permanently homeless.
I like… don’t understand the stance that Ruud or vlemmy are overreacting at all. Those are the stakes in some places, including where the majority of instances are hosted. It gives me the read that some people don’t care that the admins are just people like you and me, hosting these services to make good communities happen. The expectation for some people seems to be: “Keep the service up no matter what. I want to view content. If it becomes impossible to host, just sail out into international waters. The content must flow.”
Though I haven’t confirmed it (nor would I want to at this point), from my recollection of events, some community on burggit.moe was allegedly the source of the problematic images thought to have taken vlemmy.net down, reportedly sexual depictions of young fictional cartoon/animal characters. That material might not be illegal in some places, but it definitely is in Ireland, so the server owner was notified as such and had most of their online accounts removed.
My Fedilore+drama post on it