On my machine I’m running openSUSE Tumbleweed and have the amdgpu driver installed. I use it for gaming, and recently I’ve become interested in running LLMs. I’d like to keep a balance of both without compromising too much on performance.

I know that there are proprietary drivers for AMD cards, but I’m hesitant to install them, as I’ve heard they perform worse in games than the open source driver.

I’m mainly confused about this ROCm thing. Is it not included with the open source amdgpu driver? Or is it available as a separate package?

So which driver should I use?

Or perhaps, is it possible to run oobabooga or Stable Diffusion within a Distrobox container (with the proprietary drivers) and still keep using the open source GPU driver for the host operating system?
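For reference, this is the kind of check I imagine running inside such a container to see whether the GPU is actually visible. It’s a rough sketch I haven’t verified; it assumes a ROCm build of PyTorch, which reports AMD devices through the torch.cuda API:

```python
# Rough, unverified sketch: does the PyTorch inside the container see the GPU?
# ROCm builds of PyTorch expose AMD devices through the torch.cuda API.
import torch

if torch.cuda.is_available():
    print("GPU visible:", torch.cuda.get_device_name(0))
    print("HIP version:", torch.version.hip)  # set on ROCm builds, None on CUDA builds
else:
    print("No GPU visible; falling back to CPU")
```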

  • taladar@sh.itjust.works · 10 months ago

    To do general purpose GPU calculations on AMD hardware you need a GPU that is supported by ROCm (AMD’s equivalent to CUDA). Most of the gaming GPUs are not.

    There is a list here, but be aware that it is for the latest ROCm version; some tools might still use older versions with different supported devices.

    https://rocm.docs.amd.com/projects/install-on-linux/en/latest/reference/system-requirements.html#supported-gpus
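
    If you want to check what your own card identifies as, something like this prints the gfx target names you would look up in that list. Rough sketch, assuming the rocminfo tool from the ROCm packages is installed and on PATH:

    ```python
    # Rough sketch: ask rocminfo (ships with ROCm) which gfx targets are present,
    # then compare them against AMD's support list linked above.
    import re
    import subprocess

    out = subprocess.run(["rocminfo"], capture_output=True, text=True, check=True).stdout
    targets = sorted(set(re.findall(r"gfx\w+", out)))
    print("Reported gfx targets:", targets)  # e.g. ['gfx1030'] for an RX 6800
    ```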

    • madnificent@lemmy.world · 10 months ago

      Has that changed recently? I’ve run ROCm successfully on an RX 6800. I seem to recall the card was supported; the host OS (Arch) was not.

      • taladar@sh.itjust.works · 10 months ago

        When I tried it maybe a year or so ago there were four supported chipsets in that version of ROCm (5.4.2, I think), but I don’t remember which card models those were since they were only specified by their internal chip names. Mine (a 5700 XT) wasn’t supported at the time.

    • turbodrooler@lemmy.world · 10 months ago

      This link is misleading. For example, the Radeon RX 6800 IS supported because it is the same chip (GFX1030) as one of the Radeon Pro cards. Many others are too, though support does not go very far back.

    • exu@feditown.com · 10 months ago

      Llama.cpp supports OpenCL as well, and in my limited experience it performs better than ROCm. That should work on basically any GPU.
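
      For example, with the llama-cpp-python bindings it is just a flag. Minimal sketch, assuming the bindings were built with GPU offload enabled and the placeholder path is swapped for a real GGUF model:

      ```python
      # Minimal sketch with llama-cpp-python; assumes a build with GPU offload
      # enabled (OpenCL/CLBlast or ROCm) and a real model file.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./models/some-model.gguf",  # placeholder path
          n_gpu_layers=32,  # layers to offload to the GPU; 0 = CPU only
      )
      print(llm("Q: What is ROCm? A:", max_tokens=64)["choices"][0]["text"])
      ```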

  • madnificent@lemmy.world · 10 months ago

    Latest ollama has support for AMD GPUs. I had to compile from source to make it pick up the GPU on my system.
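
    Once it picks up the GPU, using it is just the normal HTTP API. Minimal sketch; the model name is only an example, use whatever you have pulled:

    ```python
    # Minimal sketch: query a running ollama instance (default port 11434).
    # "llama2" is just an example model name.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama2", "prompt": "Why is the sky blue?", "stream": False},
    )
    print(resp.json()["response"])
    ```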

  • Falcon@lemmy.world · 10 months ago

    Basically, ROCm and CUDA allow one to do math on the GPU. Most linear algebra operations (i.e. LLMs, NNs, and ML generally) can be parallelized over a GPU, which is much more performant than the CPU.

    To perform calculations on the GPU, one needs some sort of interface from their programming language of choice. NVIDIA has CUDA, which is written in C++ with bindings for Python (PyTorch, TensorFlow, etc.) and Julia (Flux, etc.).

    ROCm is AMD’s solution; its bindings are young and not widely implemented.

    My advice: play around with Flux ROCm and PyTorch ROCm just to get an idea. Suffice it to say, when I started doing RL and LLMs more seriously I gave up my Colab and sold my AMD cards to fund a 3060.
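
    To make the “math on the GPU” point concrete, here is a toy sketch in PyTorch (on a ROCm build the "cuda" device name maps to the AMD card):

    ```python
    # Toy sketch: the same matrix multiply on CPU and, if available, on GPU.
    import time
    import torch

    a = torch.randn(4096, 4096)
    b = torch.randn(4096, 4096)

    t0 = time.time()
    a @ b
    print(f"CPU: {time.time() - t0:.3f}s")

    if torch.cuda.is_available():
        a_gpu, b_gpu = a.to("cuda"), b.to("cuda")
        torch.cuda.synchronize()  # finish the copies before timing
        t0 = time.time()
        a_gpu @ b_gpu
        torch.cuda.synchronize()  # wait for the kernel so the timing is honest
        print(f"GPU: {time.time() - t0:.3f}s")
    ```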

  • moreeni · 10 months ago

    Local LLMs usually run on the CPU only.

      • moreeni · 10 months ago

        Interesting. Thanks for sharing!

      • moreeni · 10 months ago

        Oobabooga is just a web UI for local text-gen LLMs, which usually only utilise the CPU.

        SD, on the other hand, does utilise the GPU. I haven’t tried it, but I don’t see why it wouldn’t run with the open source AMD drivers; they are good. Just try it yourself: everything is free, and in the worst case you’d only lose some time.
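
        If you do try it, the standard diffusers recipe is short. Untested sketch on my end; the model ID is just the common example one:

        ```python
        # Untested sketch: Stable Diffusion via diffusers. On a ROCm build of
        # PyTorch, "cuda" maps to the AMD card.
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
        )
        pipe = pipe.to("cuda")
        image = pipe("a watercolor fox in a forest").images[0]
        image.save("fox.png")
        ```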

    • sapetoku@sh.itjust.works · 10 months ago

      All the ones I’ve used so far are able to use the GPU, but it has to be enabled in the app settings. I mostly use LM Studio and it flies on my NVIDIA 3060. It doesn’t seem to have options for AMD GPUs though, unless I’m mistaken.