• spujb@lemmy.cafe · 8 months ago

    i miss when we kept gpt unpublished because it was “too dangerous”. i wish we could have released it in a more mature way.

    because we were right. we couldn’t be trusted, and we immediately ruined the biggest wonder of humanity by having it generate thousands to millions of articles for a quick buck. the toothpaste is out of the tube now and it can never go back in.

      • JackGreenEarth · 8 months ago

        And just to make it clear, we should not give the government the ability to monitor every computer in existence, or even any computer not owned by them.

        • spujb@lemmy.cafe · 8 months ago

          also, there are absolutely other ways to regulate technology, especially since it’s a tech that’s being bought and sold.

          “monitor every computer” is emphatically not the only solution, and it’s weird that they suggested it was lol

      • spujb@lemmy.cafe · 8 months ago

        it’s not the “making one” that’s the problem. it’s the making, optimizing, and rabid marketing of one in the service of capital instead of humans.

        if only a bunch of open source, true non-profits had released language models, the landscape might still suck, but it would be distinctly less toxic.

        and if the government (or even a decently sized ngo standards body) had worked proactively with computer scientists on solutions like watermarking, labor replacement protections, and copyright protections, things might arguably be perfect. not one of those things happened, and so further into the hellscape we descend.