An in-depth police report obtained by 404 Media shows how a school, and then the police, investigated a wave of AI-powered “nudify” apps in a high school.
Samsung’s AI does watermark its images in the EXIF metadata. Yes, it’s trivial to remove EXIF data, but it’s enough to catch low-effort bad actors.
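To illustrate how trivial EXIF removal is: EXIF data lives in a JPEG's APP1 segment, so stripping it is just a matter of walking the file's segment list and dropping that one block. A minimal standard-library sketch (the `strip_exif` helper is hypothetical, not any vendor's API, and handles only baseline JPEG layout):

```python
import struct

def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Remove EXIF (APP1) segments from a JPEG byte string.

    Walks the JPEG segment list and drops any APP1 marker whose
    payload starts with the "Exif" signature; everything else,
    including the entropy-coded image data after SOS, is copied as-is.
    """
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break  # malformed stream; stop rather than guess
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:  # SOS: image data follows, copy the rest verbatim
            out += jpeg_bytes[i:]
            break
        (length,) = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])
        segment = jpeg_bytes[i:i + 2 + length]
        # APP1 (0xFFE1) carrying the "Exif" signature holds the EXIF block
        if not (marker == 0xE1 and segment[4:10] == b"Exif\x00\x00"):
            out += segment
        i += 2 + length
    return bytes(out)
```

A few lines of byte shuffling, no image decoding required, which is why a metadata-only watermark only stops the laziest actors.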
I think all the main AI services watermark their images (invisibly, not in the metadata). A nudify service might not, I imagine.
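For intuition on what "invisibly, not in the metadata" means: the mark is written into the pixel values themselves, so stripping metadata does nothing to it. A toy sketch of the idea using least-significant-bit embedding (this is a teaching example only; production schemes like Google's SynthID use far more robust techniques that survive compression and cropping, which plain LSB does not):

```python
def embed_lsb(pixels, bits):
    """Toy invisible watermark: write watermark bits into pixel LSBs.

    Changing the least significant bit shifts a pixel's value by at
    most 1, which is imperceptible, yet the bits remain recoverable
    from the pixel data alone — no metadata involved.
    """
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b  # clear the LSB, then set it to b
    return out

def extract_lsb(pixels, n):
    """Read back the first n embedded watermark bits."""
    return [p & 1 for p in pixels[:n]]
```

The point of the parent comment stands either way: a service that wants to evade detection simply skips the watermarking step.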
I was rather wondering about the support for extensive surveillance.