  • NochMehrG@feddit.de · 8 points · 1 year ago

    And it’s basically ended already. At least for ordinary people without an IT forensic team, the best advice is to be very sceptical of images and videos, more or less so depending on the source.

    • apemint@kbin.social · 4 points · 1 year ago

      Photoshop has been around for over a quarter of a century, but you don’t need a forensic team to tell when something has been photoshopped.
      Tools to detect image (and video) modifications have been around just as long and will continue to be developed alongside these technologies. We’re simply entering a new era of media creation.

      When Photoshop became mainstream, people said the exact same thing, but somehow the world didn’t turn upside down.
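      [Editor’s note: one classic heuristic behind the detection tools mentioned above is Error Level Analysis (ELA). The sketch below assumes the Pillow library; the function name is our own, and real forensic tools are far more sophisticated. The idea: resave a JPEG at a known quality, and regions that were pasted in or edited often recompress with a different error level than the rest of the image.]

```python
import io
from PIL import Image, ImageChops

def error_level_analysis(img: Image.Image, quality: int = 90) -> Image.Image:
    """Return the per-pixel difference between img and a resaved JPEG copy.

    Brighter regions in the result recompressed with more error, which
    can hint at areas that were edited after the original compression.
    """
    rgb = img.convert("RGB")
    buf = io.BytesIO()
    rgb.save(buf, format="JPEG", quality=quality)  # resave at a fixed quality
    buf.seek(0)
    resaved = Image.open(buf)
    return ImageChops.difference(rgb, resaved)  # pixel-wise |a - b|
```

      [The difference image is usually brightness-scaled before visual inspection; a uniform error level across the frame is consistent with, but does not prove, an unedited photo.]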

        • secrethat@kbin.social · 2 points · 1 year ago

          We just need to have our politicians painted in a way that is hard to replicate with current generative AI technology

      • IWantToFuckSpez@kbin.social · 1 point · 1 year ago

        Photoshop still required skill to create a convincing fake. With AI, it’s a lot easier for people without artistic skill to make deepfakes. Sure, there are tools to detect these fakes, but making them will keep getting easier, so social media feeds will be flooded with so many fakes that the damage will be done before they can be debunked. Remember the altered video of Pelosi that made her sound drunk and went viral in the right-wing sphere? That will happen exponentially more often in the future.

    • Itty53@kbin.social · 1 point · edited · 1 year ago

      Grain of salt, rather than skepticism.

      No normal person can waste their time being truly skeptical of everything they see. The reality is we don’t have to (nor do we often) believe everything we see in normal situations. So take everything you see online, on TV, or from afar with a grain of salt. Don’t put too much faith in it. Shit, this is important even with real, unaltered video footage: context can change everything. Take twenty seconds of a twenty-minute video and you can make it say anything you want. Just cut the context to suit.

      In situations where evidence counts, such as a courtroom, the chain of custody of that evidence is considered. Any old mp4 can’t be offered to the court as evidence without the other side getting it thrown out. So there really, truly is hardly any concern whatsoever about generative AI in the courts.

      And the other side of that coin is confirmation bias. I don’t care how shitty the fake is: if you show a MAGA a video of Biden eating a baby, that person will insist it’s real. Against any evidence to the contrary, they’ll argue it’s real, and reality won’t matter.

      That’s what’s been meant by the “post truth world”. It isn’t a problem of establishing what is and isn’t truth. The problem is that the truth doesn’t matter anymore.

  • collegefurtrader@discuss.tchncs.de · 4 points · 1 year ago

    Any photo that I didn’t personally capture is suspicious.

    Btw, did you know that some phones automatically paste in a better-looking fake moon if you take a pic of the moon at night?