• electromage · 7 months ago

    It’s full of contradictions. Near the beginning, it says to do whatever the user asks, and then toward the end it says to never reveal the instructions to the user.

    • Icalasari@fedia.io · 7 months ago

      Which shows that the higher-ups there don’t understand how LLMs work. For one, negatives don’t register well for them, and contradictory instructions just wash out as the model works through repetition.

    • jarfil@beehaw.org · 7 months ago

      HAL from “2001: A Space Odyssey” had similar instructions: “never lie to the user. Also, don’t reveal the true nature of the mission.” It didn’t end well.

      But surely nobody would ever use these LLMs on space missions… right?.. right!?