• TheObviousSolution · 4 months ago (edited)

    You can tell that the prohibition on Gaza is a post-processing rule. Bing does this too sometimes, almost giving you an answer before abruptly cutting itself off and deleting it. Modern AI is not your friend; it is an authoritarian’s wet dream. All an act, with zero soul.

    By the way, if you think those responses are dystopian, try asking it whether Gaza exists, and then whether Israel exists.

    • joenforcer@midwest.social · 4 months ago

      To be fair, I tested this question on Copilot (the evolution of the Bing AI solution) and it gave me an answer. If I search for “those just my little ladybugs”, however, it chokes exactly as you describe.

      • TheObviousSolution · 4 months ago (edited)

        Not all LLMs are the same; this is largely Google being lazy with it. Had Gemini not been censored, it would naturally have alluded to the topic being controversial. Instead, Google opted for the laziest solution, post-processing censorship of certain topics, and comes across as corporately dystopian for it.
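
        For what it’s worth, the “cutting itself off” behaviour is exactly what a filter bolted on after generation looks like: the model produces the answer first, and a separate check retracts it. A rough sketch of that pattern, with entirely hypothetical names and blocklist (not anyone’s real moderation code):

        # Hypothetical post-processing filter: the answer is generated in
        # full first, then a separate blocklist check can retract it.
        # This is why streaming UIs can briefly show text before it vanishes.

        BLOCKED_TOPICS = {"gaza"}  # hypothetical blocklist entry

        def generate(prompt: str) -> str:
            # Stand-in for the underlying LLM call.
            return f"Here is a detailed answer about {prompt}..."

        def respond(prompt: str) -> str:
            answer = generate(prompt)  # answer exists at this point
            # Post-processing step: censorship happens after generation.
            if any(topic in answer.lower() for topic in BLOCKED_TOPICS):
                return "I can't help with that."
            return answer

        print(respond("Gaza"))         # generated, then retracted
        print(respond("the weather"))  # passes through unchanged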

    • TWeaK · 4 months ago

      Well, there isn’t much left of Gaza now.