I suspect this is the direct result of AI-generated content simply overwhelming any real content.

I tried DuckDuckGo, Google, Bing, and Qwant, and none of them really helps me find the information I want these days.

Perplexity seems to work, but I don’t like the idea of AI giving me “facts,” since they’re mostly based on other AI-generated posts.

  • ContrarianTrail · 4 hours ago

    LLMs have their flaws, but for my use it’s usually good enough. It’s rarely mission-critical information that I’m looking for. It satisfies my thirst for an answer, and even if it’s wrong I’m probably going to forget it in a few hours anyway. If it’s something important, I’ll start with ChatGPT and then fact-check it by looking up the information myself.

    • zarkanian@sh.itjust.works · 4 hours ago

      So, let me get this straight… you “thirst for an answer,” but you don’t care whether or not the answer is correct?

      • ContrarianTrail · 3 hours ago

        Of course I care whether the answer is correct. My point was that even when it’s not, it doesn’t really matter much because if it were critical, I wouldn’t be asking ChatGPT in the first place. More often than not, the answer it gives me is correct. The occasional hallucination is a price I’m willing to pay for the huge convenience of having something like ChatGPT to quickly bounce ideas off of and ask about stuff.

        • zarkanian@sh.itjust.works · 1 hour ago

          I agree that AI can be helpful for bouncing ideas off of. It’s been a great aid in learning, too. However, when I’m using it to help me learn programming, for example, I can run the code and see whether or not it works.

          I’m automatically skeptical of anything these models tell me, because I know they could just be making something up. I always have to verify.