• Rhaedas@kbin.social
    1 year ago

    Yes, I know better, but ask a kid that and perhaps they'd do it. An LLM isn't thinking, though; it's repeating its training through probabilities. And btw, yes, humans can be misaligned with each other, harboring their own goals underneath common ones. Humans think, though… well, most of them.
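
    The "repeating training through probabilities" point is, mechanically, next-token sampling: the model assigns a score to each candidate token and one is drawn in proportion to its probability. A toy sketch (hypothetical scores and vocabulary, not a real model):

    ```python
    import math
    import random

    def sample_next_token(logits):
        # Softmax: turn raw scores into probabilities that sum to 1.
        exps = [math.exp(x) for x in logits]
        total = sum(exps)
        probs = [e / total for e in exps]
        # Draw one token index, weighted by its probability.
        r = random.random()
        cumulative = 0.0
        for i, p in enumerate(probs):
            cumulative += p
            if r < cumulative:
                return i
        return len(probs) - 1  # guard against float rounding

    vocab = ["cat", "dog", "the"]       # hypothetical vocabulary
    scores = [2.0, 0.5, 1.0]            # hypothetical model scores
    print(vocab[sample_next_token(scores)])
    ```

    No reasoning happens in this loop; higher-scored continuations are simply more likely to be emitted, which is the commenter's point.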