• MagicShel@programming.dev · 1 year ago

    The question is whether AI will satiate those people or drive them to further extremes. I’m hopeful it might reduce demand for making content “the traditional way” because the AI version is lower risk (appealing to “less harm” doesn’t seem to motivate such people).

    I say hopeful because this cat can’t be put back in the bag. The technical solutions they suggest are desperate grasping at straws and (self-evidently) won’t work. People trading in CSAM are already taking extreme legal and social risks, so it’s hard to imagine what greater deterrent could be applied to AI-generated versions. I think all we can do is hope this makes things better and not worse, because if it goes the other way the future is looking grim.

    • pokexpert30@lemmy.pussthecat.org · 1 year ago

      WARNING: BEFORE YOU TWIST MY WORDS, I’M AGAINST PEDOPHILIA AND BASICALLY ANYTHING RELATED TO IT

      Yeah, you can’t un-harm the children whose abuse was used to build the model, but maybe you can save thousands or more children by just… flooding the “market” with cheap content that harms no one? Like this plan with rhino horns: https://www.theguardian.com/environment/2019/nov/08/scientists-plan-to-flood-black-market-with-fake-rhino-horn-to-reduce-poaching

      • billwashere@vlemmy.net · 1 year ago

        There are going to be some very interesting studies done about whether or not this is a good thing. I can easily see both sides of the argument. Ultimately this is a mental illness that needs to be dealt with, but I’m curious whether these images could be created to allow some sort of aversion therapy without harming children in the process. Again, like OP said, I’m totally against any sort of child pornography. But interesting times indeed.

    • JakeBacon · 1 year ago

      I personally doubt it. If you consider it an addiction, then you can compare it to the way drug addicts have to take more to get the same high and may even seek out more extreme drugs.

      I’m worried it would make things worse, but I’m not sure.

  • dog@yiffit.net · 1 year ago

    Wow, that’s terrifying and gross 🙃 It would have been free for people to simply not do this.

  • lilweeb@beehaw.org · 1 year ago

    Wish I hadn’t deleted my Reddit account, because I predicted this. I also predicted that AI would be used to generate “proof” that a person is a child abuser. And there’s absolutely nothing we can do about any of this. What a nightmare.

  • ShaunaTheDead@kbin.social · 1 year ago

    I would be okay with these AI-generated images being made available to people who have self-identified as pedophiles to their local government body, agreed to be placed on a special list, and entered psychiatric treatment. After all, although it’s absolutely disgusting, at least with AI-generated images no children are being harmed, and if it brings these people forward to be identified, monitored, and treated before any real-life abuse happens, then it could actually save real children from being exploited, which is obviously a noble goal.

    I just don’t envy the poor bastard who has to set up the test data for the AI to generate all that art…