• Kbin_space_program@kbin.social
      1 year ago

      There’s one I saw yesterday where the prompt was a Bugatti and some other supercar in a steampunk style.

      The AI just threw back a silver and gold Arkham series Batmobile.

      Edit: Because it’s fun, I tested Bing’s image generator. They auto-block on the word “Disney” (e.g. “a mouse in the style of Walt Disney”), but they allow “a mouse in the style of Steamboat Willie.”

      Also, I think DALL-E is being used to make prodigious amounts of porn. Pretty much anything I tried with the word “woman” gets content-blocked in Bing: “woman eating fried chicken” is blocked, but “man eating fried chicken” is not.

      • captainlezbian@lemmy.world
        1 year ago

        Oh nice, all women get to experience what lesbians have been experiencing for a while. Welcome, sisters, to being treated as inherently pornographic. You don’t get used to it.

        • Kbin_space_program@kbin.social
          1 year ago

          So I played around some more. If I used the term “woman,” I had to specify that they were clothed or name the clothing they were wearing; for one prompt I had to add “fully clothed” and specify “a full suit.”

          I went back over the one prompt that had worked previously, and the women were nude. It only passed because it put the scene in silhouette, which apparently let it get past the censors.

          But it had absolutely no issue reproducing Iron Man and Ultron from a two-word prompt, and the scariest part is that it can make reproductions of major celebrities.

          • captainlezbian@lemmy.world
            1 year ago

            Yeah, that’s the thing: it’s not even surprising. Men are socially treated as the default in our culture, and especially by tech people (who are overwhelmingly men and surrounded by other men in social and professional contexts). The cultural sexualization of women showing up in these models is exactly what one would expect: when people use one to create a person doing a thing, they’re usually looking for that thing, and the same goes for a man, but for women it’s often for porn. I would be shocked if that wasn’t a problem Google had to actively combat early on.

            In short, more tech people need to read feminist theory as it relates to what they’re making