• Soyweiser@awful.systems · 13 points · 6 months ago

    “I think solving the AI hallucination problem — I think that’ll be fixed.”

    Wasn’t this an unsolvable problem?

    • Amoeba_Girl@awful.systems · 20 points · edited · 6 months ago

      it’s unsolvable because it’s literally how LLMs work lol.

      though to be fair i would indeed love for them to solve the LLMs-outputting-text problem.

      • aStonedSanta · 2 points · 6 months ago

        Yeah. We need another program to control the LLM tbh.

          • zogwarg@awful.systems · 5 points · 6 months ago

            Sed quis custodiet ipsos custodes? = But who will control the controllers?

            Which, in a beautiful twist of irony, is thought to be an interpolation in the texts of Juvenal (in manuscript-speak, an insertion added by later scribes).