THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes: sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress's upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), along with Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they "knew or recklessly disregarded" the fact that the victim did not consent to those images.

  • Todd Bonzalez
    3 months ago

    The Stable Diffusion community tends to get rabid about anything that resembles government regulation of AI. The most popular Stable Diffusion textual inversion on CivitAI is an age-slider that goes young-younger-youngest, and half the content on the site is porn, if that’s any indication of what many of those users do with it.

    • JokeDeity
      3 months ago

I guess I’m just confused about how this would affect them, since they already have the tools on their personal devices and don’t need an internet connection to use them.