• ClockworkOtter@lemmy.world · 5 months ago

    I wonder if the AI is detecting that the photo is taken from further away and below eye level, which is more likely for a photo of a man, rather than looking at her facial characteristics?

    • drcobaltjedi@programming.dev · 5 months ago

      Yeah, this is a valid point. Whether or not that's exactly what's happening here I don't know, but a lot of people don't realize how many weird biases can show up in the training data.

      Like that AI trained to detect if a mole was cancerous or not. A lot of the training images that showed cancer had rulers in them, so the AI learned that rulers are cancerous.

      I could easily see something stupid like the angle the picture was taken from being something the AI erroneously assumed was useful for determining biological sex in this case.
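
      (Just to illustrate the ruler thing: here's a tiny toy sketch of how a model latches onto a spurious feature. The data is completely made up, not real melanoma data, and the "ruler present" column and the 90% number are purely hypothetical.)

      ```python
      # Toy sketch only: synthetic data, hypothetical "ruler present" feature.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 2000

      # Training set: rulers co-occur with "cancer" 90% of the time.
      y_train = rng.integers(0, 2, n)
      lesion = 0.5 * y_train + rng.normal(0, 1, n)          # weak real signal
      ruler = (rng.random(n) < np.where(y_train == 1, 0.9, 0.1)).astype(float)
      clf = LogisticRegression().fit(np.column_stack([lesion, ruler]), y_train)
      print("weights [lesion, ruler]:", clf.coef_)          # ruler weight dominates

      # "Deployment": rulers now appear at random, so accuracy drops sharply.
      y_test = rng.integers(0, 2, n)
      lesion_t = 0.5 * y_test + rng.normal(0, 1, n)
      ruler_t = (rng.random(n) < 0.5).astype(float)
      print("test accuracy:", clf.score(np.column_stack([lesion_t, ruler_t]), y_test))
      ```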

    • Tyoda · 5 months ago

      It’s possible to manipulate an image in such a way that the original and the new one are indistinguishable to the human eye, but the AI model gives completely different results.

      Like this helpful graphic I found
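
      (For anyone curious what that kind of manipulation looks like in practice, here's a minimal sketch of the classic FGSM trick from the literature, not anything specific to the system in the article. The `model`, `image`, and `label` names are hypothetical placeholders for a pretrained PyTorch classifier and one of its inputs.)

      ```python
      # Minimal FGSM-style sketch; `model`, `image`, `label` are assumed placeholders.
      import torch
      import torch.nn.functional as F

      def fgsm_perturb(model, image, label, eps=0.003):
          """Nudge each pixel by at most eps in the direction that raises the loss.

          The per-pixel change is tiny (invisible to a human), yet it can flip
          the model's prediction completely.
          """
          image = image.clone().detach().requires_grad_(True)
          loss = F.cross_entropy(model(image), label)
          loss.backward()
          adv = image + eps * image.grad.sign()   # one small step along the gradient sign
          return adv.clamp(0, 1).detach()

      # Usage sketch: compare predictions before and after the imperceptible change.
      # adv = fgsm_perturb(model, image, label)
      # print(model(image).argmax(dim=1), model(adv).argmax(dim=1))
      ```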

      Or… edit the HTML…