• ExtremeDullard@lemmy.sdf.org · 9 months ago

    And this is a surprise how?

    The entire digital economy is based on spying. It’s called corporate surveillance and it’s been around for 25 years. Why would AI escape this business model? If anything, it turbocharges it.

  • Optional@lemmy.world · 9 months ago

    Wait wait wait. Hold on. Okay. Okay. Wait.

    Wait so - Microsoft?? Has been spying? On its own customers?!?

    I just . . . I mean, it . . . I don’t know what to say!

  • catloaf · 9 months ago

    Microsoft looks at the data you send them? God forbid.

    • pdxfed@lemmy.world · 9 months ago

      Reassuring that the 35 phishing emails I report to them a day from my Hotmail junker are going to be addressed.

  • grandel@lemmy.ml · 9 months ago

    Isn’t that their business model? How else could Windows be offered for “free”?

  • sub_ubi@lemmy.ml · 9 months ago

    As a bad Python scripter, I’m stuck using Microsoft’s AI because there isn’t a privacy-focused alternative anywhere near as good.

    • swordsmanluke@programming.dev · 9 months ago

      It’s not as good, but running small LLMs locally can work. I’ve been messing around with ollama, which makes it drop-dead simple to try out different models locally.

      You won’t be running any model as powerful as ChatGPT - but for quick “stack overflow replacement” style of questions I find it’s usually good enough.

      And before you write off the idea of local models completely, some recent studies indicate that our current models could be made orders of magnitude smaller for the same level of capability. Think Moore’s law, but for shrinking the required connections within a model. I do believe we’ll be able to run GPT-3.5-level models on consumer-grade hardware in the very near future. (Of course, by then GPT-7 may be running the world, but we live in hope.)
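
      For anyone who wants to try it, the basic ollama workflow is just a couple of commands. (The model name below is only an example of a small code-oriented model; check the ollama model library for what’s currently available, and note that the install script requires Linux/macOS.)

      ```shell
      # Install ollama (see ollama.com for platform-specific options)
      curl -fsSL https://ollama.com/install.sh | sh

      # Download a small code-focused model
      ollama pull codellama:7b

      # Ask it a one-off "stack overflow replacement" style question
      ollama run codellama:7b "How do I read a CSV file in Python?"

      # List the models you have installed locally
      ollama list
      ```

      Everything runs against a local daemon, so nothing you type leaves your machine.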

    • Landless2029@lemmy.world · 9 months ago

      Check out GitHub Copilot.

      Not free, but it’s cheap as paid products go, and supposedly privacy-focused.

      “If you’re not paying for a product, you are the product.” Shame it’s usually both!

      • Possibly linux@lemmy.zip · 9 months ago

        GitHub Copilot pirates other people’s code. Legally that’s hard to pursue, but it’s enough to make me dislike them.

        • Landless2029@lemmy.world · 9 months ago

          Oh, their model is 100% trained on public repositories. I doubt they even bothered to filter it down to open-source/fair-use code.

          My issue here is AI isn’t going to replace my job, but an engineer who uses AI as a tool would replace me…

        • Landless2029@lemmy.world · 9 months ago

          I’m aware. I looked into whether your source code gets used to train their ML. I looked over the FAQ and got the “Your code is your own” vibe. Sadly, it just points to their standard Privacy Statement, which could change at any time and allow them to do whatever they want.

          Will my private code be shared with other users?

          No. We follow responsible practices in accordance with our Privacy Statement to ensure that your code snippets will not be used as suggested code for other users of GitHub Copilot.