• ContrarianTrail
    15 days ago

    Not knowing what it’s talking about is irrelevant if the answer is correct. Humans who know what they’re talking about are just as prone to mistakes as an LLM is; some could argue in far more numerous ways, too. I don’t see the way they work as being as different from each other as most other people here seem to.