‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • originalfrozenbanana · 10 months ago

    So it’s fine to violate someone’s privacy so long as you don’t share it? Weird morals you got there.

    • TrickDacy@lemmy.world · 10 months ago

      Am I violating privacy by picturing women naked?

      Because if it’s as cut and dried as you say, then the answer must be yes, and that’s flat-out dumb.

      I don’t see this as a privacy issue, and I’m not sure how you’re squeezing it into that. I’m not sure what it is, but you cannot violate a person’s privacy by imagining or generating an image of them. It’s weird and creepy, and because it can be mistaken for a real image, it’s not proper to share.

      Can you actually stop clutching pearls for a moment to think this through a little better?

      • originalfrozenbanana · 10 months ago

        Sexualizing strangers isn’t a right afforded to you by society. That’s a braindead take. You can ABSOLUTELY violate someone’s privacy by generating an image of them. That’s both a moral and a legal question, and it has an answer.

        Your comment is a self report.

        • TrickDacy@lemmy.world · 10 months ago

          “That’s a braindead take”

          Projection. Since you have no room for new thoughts in your head, I consider this a block request.