• deadbeef79000@lemmy.nz · 8 points · 5 months ago

    That’s assuming the CEO isn’t already hallucinating.

    At least when an LLM hallucinates you can tell it and it won’t fire you.

  • TheObviousSolution · 4 points · 5 months ago

      It doesn’t have the power to do so. But it does have the power to shrug off your questions. Has an LLM ever shrugged off your questions?

    • deadbeef79000@lemmy.nz · 4 points · 5 months ago (edited)

        Sort of. I had GitHub Copilot hallucinate an AWS CloudFormation template stanza.

        I asked it for the source it used for the stanza, and it gave me a URL.

        I told it that the crap it just gave me wasn’t on that page.

        It apologised and told me to RTFM.

        So, yeah, even super auto correct is a dick.