It seems crazy to me, but I've seen this concept floated on several different posts. There seem to be a number of users here who think there is some way AI-generated CSAM will reduce real-life child victims.
Like the comments on this post here.
https://sh.itjust.works/post/6220815
I find this argument crazy. I don’t even know where to begin to talk about how many ways this will go wrong.
My views (which are apparently not based in fact) are that AI CSAM is not really that different from "actual" CSAM. It still causes harm when viewed, and it is still based on the further victimization of the children involved.
Further, the (ridiculous) idea that making it legal will somehow reduce the number of predators by giving them an outlet that doesn't involve real living victims completely ignores the reality of how AI content is created.
Some have compared pedophilia and child sexual assault to a drug addiction, which is dubious at best, and pretty offensive imo.
Using drugs has no inherent victim. And it is not predatory.
I could go on, but I'm not an expert or a social worker of any kind.
Can anyone link me articles talking about this?
I would certainly condemn the killers. But you’re right, I feel a large segment of the online population wouldn’t.
Back in uni I had a roommate who was a celibate pedophile; great kid, brilliant programmer, always kind and with a good sense of humor.
And a chronic alcoholic since the age of 14 as a coping mechanism.
None of us ever even knew until, back in 2006, he went to the school therapist to try to learn better coping mechanisms than getting blackout drunk every day at 7pm sharp.
She deemed him a threat and contacted the FBI, because apparently patient confidentiality in the U.S. doesn't protect pedophiles. Since he had a niece he had never met (on purpose) on the other side of the continent, she felt validated in her actions.
They came and took him into custody. It wasn’t an arrest, just remanding to mental healthcare for evaluation against his will.
The officers picking him up were pretty loud about the fact they were escorting a pedophile. Made some coarse jokes about it as they walked him out. Took his computer of course.
A week later he came back, broken AF. He called together me and two other people he considered friends and laid out the whole situation.
He had struggled with his desires his entire life and went to monumental lengths to eliminate even the chance of contact. He never touched a kid, never used CSAM or even hentai (I can confirm that; the fact that his PC was clean of anything even remotely naughty was already a bit of a folklore legend around the dorm), and he vowed he would maintain this lifestyle forever.
He tried same-age relationships, and some were okay, but none lasted more than six or seven months.
Of course the psych eval and situational examination cleared him of any suspicion, but the damage had already been done, and his parents picked him up that afternoon. If it weren't for campus security walking him out, he would have been mobbed by the dozens of angry students who had heard the worst part of what happened.
The school even tried clearing his name later but it only made his memory more of a laughing stock.
We kept in touch for a few months, mainly through Steam as we were both avid gamers.
Then one day he just stopped logging in. At the time I was too scared to call his family, so I just waited. Sixteen years later, he's still offline.
Peyton I miss you man.
p.s. literally zero consequences for the therapist for ruining a bright kid's life.
I’m so sorry, that’s such a sad story.
I appreciate it. It makes me want to advocate more, but then I become another target. It's fucked up all around, and the only thing I can say is that we need better and more secure mental healthcare in this country.