A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • cley_faye@lemmy.world · 3 months ago

    I’m not familiar with US law, but… isn’t it already some form of crime to distribute nudes of someone without their consent? That shouldn’t change whether or not AI is involved.

    • T156@lemmy.world · 3 months ago (edited)

      It might depend on whether a wholesale fabrication would legally count as a nude of that person at all. If you’re generating it, the “nude” could arguably be considered a different person, since the body belongs to someone else with the victim’s face placed on top, or the whole image is a fabrication produced by a computer.

      It’s also unclear whether it would still count if the image actually depicted someone else and was merely passed off as the victim: for example, pretending a headless mirror nude sent by a third party had been sent by the victim.