• mindlight

    Microsoft has no choice.

    Arm has been dominating the fastest-growing market, mobile (everything from phones to tablets). Intel is fighting a three-front war now. One battlefront is the mobile market, where ARM is essentially the only choice; another is dominated by Nvidia with its processors for graphics and ML/AI. If that wasn’t bad enough, AMD is attacking hard in Intel’s home arena: PC CPUs.

    When Apple dropped Intel for the M1, they showed that Arm wasn’t just some niche processor technology for less powerful devices like phones and tablets.

    So not only is AMD taking market share in the PC space, ARM is on the rise as well, and things don’t look very good for Intel right now.

    Is Intel really capable of innovating its way off its current path to extinction?

    • orclev@lemmy.world

      Longer term it’s going to be interesting to see what, if anything, RISC-V changes. Right now it’s filling the role ARM occupied about 20 years ago, as primarily an alternative for cheap and medium-power devices, but just like ARM it has the potential to duke it out in the desktop space with the right backing. It would, for instance, be an interesting move if Microsoft partnered with a company like SiFive to produce a truly high-end RISC-V CPU similar to Apple’s M1/M2.

      • Ugurcan@lemmy.world

        Producing a really high-end CPU would just be muscle flexing. Anybody can do that. Having apps run on it is a whole other story.

        What Apple did right with the M1 was not producing a powerful Arm CPU, but getting old apps to run on it so everyday people wouldn’t be thrown into unknown territory.

        I’m looking forward to RISC-V’s expansion too, though. MS could just skip ARM and adopt the better platform.

        • orclev@lemmy.world

          Producing a really high-end CPU would just be muscle flexing. Anybody can do that. Having apps run on it is a whole other story.

          You say that, but nobody has actually done it. SiFive has produced some CPUs that would qualify as extremely low-end desktop CPUs, but nothing that can compete with even middle-of-the-road processors like an i5 or a Ryzen 5. As for apps, it would be pretty trivial to get a huge swath of Linux apps running on it, and if there were enough of an install base and demand you’d see companies producing RISC-V binaries as well (much like they’re starting to for ARM). For emulation layers I’m sure something could be done; QEMU, if nothing else, could probably be used, as in the sketch below.
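
          A rough sketch of that QEMU route (assuming the Debian/Ubuntu riscv64 cross toolchain and the qemu-user package; tool and package names may differ on other distros):

          /* hello.c - a trivial program to cross-compile for RISC-V and run
           * under QEMU user-mode emulation on an x86-64 host.
           *
           * Build and run (assuming riscv64-linux-gnu-gcc and qemu-riscv64 are installed):
           *   riscv64-linux-gnu-gcc -static -o hello-riscv hello.c
           *   qemu-riscv64 ./hello-riscv
           *
           * -static avoids needing a RISC-V sysroot or dynamic loader on the host.
           */
          #include <stdio.h>

          int main(void) {
              printf("Hello from a RISC-V binary\n");
              return 0;
          }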

      • XNX@slrpnk.net

        Does RISC-V have the same power/heat advantages that ARM has? It would be interesting if Intel went all in on RISC.

        • orclev@lemmy.world

          Yes, technically speaking ARM is RISC, just a different flavor of it from RISC-V; they’re effectively siblings. x86 (and AMD64), on the other hand, is CISC. CISC gives you compact programs at the cost of a more complicated (and therefore more power-hungry) CPU. That said, this is a gross oversimplification, and no modern CPU is entirely RISC or CISC under the covers. Both ARM and x86 end up looking quite similar when you dig into them, with x86 decoding its instruction set into microcode that is effectively RISC, and ARM introducing some decidedly CISC-looking instructions.
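
          To make the load-store difference concrete, here’s a minimal illustration (the assembly in the comments is a sketch; actual instruction sequences depend on the compiler and flags):

          /* Incrementing a global counter: one C statement, different per-ISA costs.
           *
           * x86-64 (CISC) can read-modify-write memory in a single instruction:
           *     addl $1, counter(%rip)
           *
           * A load-store RISC ISA (ARM64, RISC-V) needs separate instructions
           * (address setup omitted), e.g. on RISC-V:
           *     lw   a5, 0(a0)     # load the current value
           *     addi a5, a5, 1     # add one
           *     sw   a5, 0(a0)     # store it back
           */
          int counter;

          void bump(void) {
              counter += 1;
          }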

          The reality is that the relative power hungriness of the architectures doesn’t come down to RISC vs. CISC so much as to x86 carrying backwards compatibility with literally decades of bad decisions. If x86 could jettison backwards compatibility and ditch all but the latest and greatest of its instruction set, it could compete watt for watt with ARM easily, but that’s a tradeoff customers are unwilling to accept, as it would render large swaths of software incompatible.

    • aluminium@lemmy.world

      Intel is fine. The fact that they are somewhat competitive on their dinosaur fabrication node is crazy by itself.

      • abhibeckert@lemmy.world

        Intel is not fine on servers: ARM servers are about 20% faster in outright performance and about 40% better in performance per dollar. Since switching is literally just selecting a different option in a dropdown menu (assuming your software runs well on ARM, which it probably does these days), why would anyone choose Intel for a server?

        And they’re not fine on laptops either: I unplugged my ARM Mac from the charger seven hours ago… and I’m at 80% charge right now. Try that with an Intel laptop with an i9 processor and a discrete NVIDIA GPU (both of which would be needed to match its performance).

        They’re only really doing well with desktop PCs, which is a small market, and with people who can’t be bothered changing to a new architecture, which is a big market but one that is going away.

        • BearOfaTime

          When you say 20% faster, by what metric? Is that per watt of power consumption, or per dollar of cost?

          If it’s either of those, that’s pretty impressive; it’s a massive difference.

    • intelisense

      You’re forgetting cloud computing: all my workloads have either moved to Graviton or will do so very shortly.