ChatGPT generates cancer treatment plans that are full of errors — Study finds that ChatGPT provided false information when asked to design cancer treatment plans. Researchers at Brigham and Women’s Hospital found that cancer treatment plans generated by OpenAI’s chatbot were full of errors.

  • Brandon658@lemmy.world · 1 year ago

    People want to trust it as a source of quick knowledge. It is easier to be told that 9 goes into 81 a total of 8 times, trusting that the computer is always right because it has access to everything, than to work out that the answer given was wrong and is actually 9.

    Think of WebMD. People love to self-diagnose despite it being commonly known as a bad practice. But they do so because it takes less effort, and is faster and cheaper, than making an appointment, driving to an office, and speaking with a doctor who runs a few tests and gets back to you in a week saying they aren’t sure and need to do that whole process over again.

    • raptir · 1 year ago

      The healthcare issue is that I’m usually checking WebMD to see whether what I’m experiencing is an actual issue I need to see a doctor for, since going to a doctor is so expensive.