For me it’s detailed descriptions of people’s dreams.
Not only does your story make no sense, but you’re also telling me about something that didn’t even happen. It’s like recounting an event and then ending the story by admitting you made it all up, except with dreams you begin by telling me it’s all made up. I’m already not interested before you’ve even started.
When it comes to contradictions and hallucinations: sure, people sometimes put out contradictory information or act hypocritically, but the difference is intention. ChatGPT can’t help but contradict itself and hallucinate, because it has no awareness of the information it’s putting forth. It will very convincingly present incorrect information without knowing that’s what it’s done.
It is impressive. But it’s being touted as more impressive than it actually is, and that’s what annoys me: the complete lack of critical evaluation of these models, and the way people give in to survivorship bias.
That’s all aside from the privacy and copyright concerns.