THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes: sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress’ upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.) in the Senate, with Rep. Alexandria Ocasio-Cortez (D-N.Y.) leading it in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to the images.

  • Todd Bonzalez · 3 months ago

    I think that’s a good thing. It stops short of censorship that could be challenged on constitutional grounds, and simply opens the door for victims to sue people who publicly exploit their likeness with simulated porn made without their consent.

    If you’re doing weird pervy shit with AI on your personal machine for private entertainment, that’s maybe somewhat reprehensible, but nobody is getting hurt if you’re not sharing it, and nobody is going to find out if you keep it to yourself. If you don’t publish or share the images, I can’t see how you would ever end up on the receiving end of a lawsuit over it.

    But post that weird shit online or share it with others, hurting the person you depicted, and I don’t see why they shouldn’t be able to come after you for damages.

    • JokeDeity · 3 months ago

      That’s kind of the point I’m making. 99.9% of this stuff never hits the public eye, and the small fraction that does will be incredibly difficult to trace back to its source if the author has even the slightest sense.