• abhibeckert@lemmy.world · 7 months ago (edited)

    Intel is not fine on servers - ARM servers are about 20% faster for outright performance and 40% faster for performance-per-dollar. Since it’s literally just selecting a different option in a dropdown menu (assuming your software runs well on ARM, which it probably does these days), why would anyone choose Intel on a server?
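    For a concrete (and purely illustrative) sense of those two numbers, here's a quick back-of-the-envelope sketch - the throughput and hourly prices below are made-up stand-ins, not benchmarks of any real instance type:

        # Illustrative only: throughput and price figures are assumptions,
        # not measurements of any specific Intel or ARM server instance.
        intel = {"requests_per_sec": 1000, "usd_per_hour": 0.40}
        arm   = {"requests_per_sec": 1200, "usd_per_hour": 0.34}

        def perf_per_dollar(spec):
            # work done per dollar spent = throughput / hourly price
            return spec["requests_per_sec"] / spec["usd_per_hour"]

        raw_gain = arm["requests_per_sec"] / intel["requests_per_sec"] - 1
        ppd_gain = perf_per_dollar(arm) / perf_per_dollar(intel) - 1

        print(f"raw performance: +{raw_gain:.0%}")   # +20%
        print(f"perf-per-dollar: +{ppd_gain:.0%}")   # +41%

    With those assumed numbers, a ~20% raw-performance edge at a slightly lower hourly price works out to roughly the 40% performance-per-dollar gap described above.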

    And they’re not fine on laptops either - I unplugged my ARM Mac from the charger seven hours ago… and I’m at 80% charge right now. Try that with an Intel laptop with a Core i9 processor and a discrete NVIDIA GPU (both of which you’d need to get similar performance).

    They’re only really doing well in desktop PCs, which is a small market, and with people who can’t be bothered switching to a new architecture - a big market, but one that is going away.

    • BearOfaTime · 7 months ago

      When you say 20% faster - by what metric? Is that per watt of power consumption, or per dollar of cost?

      If it’s either of those, that’s pretty impressive; that’s a massive difference.