THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes — sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed Congress's upper chamber on Tuesday. It was led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), along with Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • JokeDeity
    1 month ago

    Lol, yeah, makes sense. This bill that cannot possibly have real teeth to prevent this kind of thing from being done on people’s personal devices is exactly what they should spend time and money on.

    • Todd Bonzalez
      1 month ago

      I think that’s a good thing. It falls short of resembling censorship that could be challenged on constitutional grounds, and simply opens the door for victims to sue people who publicly exploit their likeness with simulated porn without their consent.

      If you’re doing weird pervy shit with AI on your personal machine for private entertainment, that’s maybe somewhat reprehensible, but nobody is getting hurt if you’re not sharing it, and nobody is going to find out if you keep it to yourself. If you don’t publish or share the images, I can’t see how you would ever end up on the receiving end of a lawsuit over it.

But post that weird shit online, or share it with others, and you hurt the person you depicted; I don’t see why they shouldn’t be able to come after you for damages.

      • JokeDeity
        1 month ago

That’s kind of the point I’m making. 99.9% of it never hits the public eye, and tracing the source of the small percentage that does will be incredibly difficult if the author has even the slightest sense.