• StickBugged · 1 year ago

    If you ask how to build a bomb and it tells you, wouldn’t Mozilla get in trouble?

    • 👁️👄👁️ · 1 year ago

      Do gun manufacturers get in trouble when someone shoots somebody?

      Do car manufacturers get in trouble when someone runs somebody over?

      Do search engines get in trouble if they accidentally link to harmful sites?

      What about social media sites getting in trouble for users uploading illegal content?

      Mozilla doesn’t need to host an uncensored model, but their open-source AI should be able to be trained to be uncensored. So I’m not asking them to host this themselves, which is an important distinction I should have made.

      Uncensored LLMs already exist anyway, so whatever damage people argue they could cause is already possible.

      • Spzi · 1 year ago

        Do car manufacturers get in trouble when someone runs somebody over?

        Yes, if it can be shown the accident was partly caused by the manufacturer’s negligence, for example if a safety measure was missing or did not work properly, or if it happens suspiciously more often with models from that brand. Apart from solid legal trouble, they can also get into PR trouble if enough people start to think that way, whether or not it’s true.

          • Spzi · 1 year ago

            Then let me spell it out: If ChatGPT convinces a child to wash their hands with self-made bleach, be sure to expect lawsuits and a shit storm coming for OpenAI.

            If that occurs, but no liability can be found on the side of ChatGPT, be sure to expect petitions and a shit storm coming for legislators.

            We generally expect individuals and companies to act with society’s peace and safety in mind, including that of strangers and minors.

            Liabilities and regulations exist for these reasons.

            • 👁️👄👁️ · 1 year ago

              Again… this is still missing the point.

              Let me spell it out: I’m not asking companies to host these services, so they wouldn’t be held liable.

              For this example to be relevant, ChatGPT would need to be open source and let you plug in your own model. We should have the freedom to plug in our own trained models, even uncensored ones. This is already the case with LLaMA and other AI systems right now, and I’m encouraging Mozilla’s AI to allow us to do the same thing.
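
              To make the “plug in your own model” point concrete, here is a minimal sketch using llama-cpp-python as the local runtime; the package choice and the model path are assumptions for illustration, not anything Mozilla ships.

              ```python
              # A minimal sketch, assuming llama-cpp-python is installed
              # (pip install llama-cpp-python) and that "./my-model.gguf" is a
              # placeholder for whatever locally downloaded weights you choose,
              # censored or not.
              from llama_cpp import Llama

              # The runtime doesn't care which weights you hand it; that choice
              # stays with the user, not the vendor.
              llm = Llama(model_path="./my-model.gguf", n_ctx=2048)

              reply = llm(
                  "Q: Why do people want to run local language models? A:",
                  max_tokens=128,
                  stop=["Q:"],
              )
              print(reply["choices"][0]["text"].strip())
              ```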