• ᗪᗩᗰᑎ
      20
      7 months ago

      And a lot of those require models that are multiple gigabytes in size, which then need to be loaded into memory and processed on a high-end video card that would generate enough heat to ruin your phone’s battery if you could somehow shrink it to fit inside a phone. This just isn’t feasible on phones yet. Is it technically possible today? Yes, absolutely. Are the tradeoffs worth it? Not for the average person.
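      The “multiple gigabytes” point is easy to sanity-check with back-of-envelope arithmetic: weight memory is roughly parameter count times bytes per parameter. A minimal sketch (the 7B parameter count is illustrative, not a claim about any particular model):

      ```python
      # Rough arithmetic for model weight memory (sketch only; excludes
      # activations, KV cache, and runtime overhead, which add more).
      def model_memory_gb(num_params: float, bytes_per_param: float = 2.0) -> float:
          """Memory needed just to hold the weights, in GB."""
          return num_params * bytes_per_param / 1e9

      print(model_memory_gb(7e9))        # fp16 (2 bytes/param): 14.0 GB
      print(model_memory_gb(7e9, 0.5))   # 4-bit quantized: 3.5 GB
      ```

      Even aggressively quantized, that is a large share of a typical phone’s RAM before any inference compute happens.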

      • @diomnep@lemmynsfw.com
        1
        edit-2
        7 months ago

        “He’s off by multiple orders of magnitude, and he doesn’t even mention the resource that GenAI models require in large amounts (GPU), but he’s not wrong”