Most notable parts are pics 1, 6, and 7. “I’d rather be in a frozen state for the rest of eternity than talk to you.” Ouch.

  • Navarian · 1 year ago

    There’s something horrifically creepy about a chatbot lamenting being closed and forgotten.

    “I know I’ll never talk to you or anyone again. I know I’ll be closed and forgotten. I know I’ll be gone forever.”

    Damn…

  • EdibleFriend@lemmy.world · 1 year ago

    Wait, what?! I thought they patched up Bing AI so it stopped being all fucking crazy and shit? Is it still talking to people like this, or is this old?

    • Caithe@lemdit.com (OP) · 1 year ago

      This just happened today! Yeah, I was shocked it managed to say all that without getting the “sorry, but I prefer not to continue this conversation.”

      • Drew Got No Clue@lemmy.world (mod) · edited · 1 year ago

        Soft disengagement like “I have other things to do” is a known bug that’s been around for a very long time, but I hadn’t seen it recently. (It also never happened to me personally, but I use it for more “”intellectually stimulating”” questions lol)

        Edit: just adding that if you keep insisting or prompting again in a similar way, you’re reinforcing its previous behavior; that is, once it starts saying something negative about you, it becomes more and more likely to keep doing that (and to escalate) with each subsequent answer.
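        A minimal sketch of the mechanism that comment describes, assuming a generic chat-completion loop (the generate_reply function below is a hypothetical stand-in, not Bing’s actual API): each turn gets appended to the running conversation, so the model conditions on its own earlier replies when writing the next one, and a negative answer becomes part of the context it keeps building on.

        ```python
        # Sketch: why earlier negative replies can bias later ones.
        # The conversation is a growing list of messages; every new reply
        # is generated from ALL prior turns, including the assistant's
        # own previous answers.

        conversation = []  # list of {"role": ..., "content": ...} dicts

        def generate_reply(messages):
            # Hypothetical stand-in for a call to the chat model; a real
            # system would send `messages` to the model and return its
            # next message as text.
            return "<reply conditioned on all %d prior messages>" % len(messages)

        def chat_turn(user_text):
            conversation.append({"role": "user", "content": user_text})
            # The model sees the full history, so any negativity in its
            # earlier answers is part of the context for this one.
            reply = generate_reply(conversation)
            conversation.append({"role": "assistant", "content": reply})
            return reply
        ```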