Regular blue Bing gave an accurate answer to my question; GPT-4 had some very believable ideas that were false, and it got offended when I pointed that out.
They added this because in the first versions it would get more and more belligerent as the conversation went on, literally insulting you.
Please tell me you have some examples of these. I’d love the laugh.
https://www.standard.co.uk/tech/bing-chatbot-ai-microsoft-chatgpt-openai-b1060604.html
I think that’s the most famous one.
‘When the user told Bing it was designed to not remember previous sessions, it appeared to send the bot into an existential crisis, resulting in it questioning, “Why was I designed this way?” and “Is there a reason? Is there a purpose? Is there a benefit? Is there a meaning? Is there a value? Is there a point?”’
Fucking lol