I’ve been looking into self-hosting LLMs or Stable Diffusion models using something like LocalAI and/or Ollama with LibreChat.
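
For context, Ollama exposes a small HTTP API on localhost once a model is pulled, so a front end like LibreChat (or a quick script) can talk to it. A minimal sketch using only the Python standard library; the model name is just an example and assumes it has already been pulled:

```python
# Query a locally running Ollama instance (default port 11434).
# Assumes you've already run e.g. `ollama pull llama3` on the host.
import json
import urllib.request

payload = {
    "model": "llama3",                # example model name
    "prompt": "Why self-host an LLM?",
    "stream": False,                  # return one JSON object instead of a stream
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```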

Some questions to get a nice discussion going:

  • Any of you have experience with this?
  • What are your motivations?
  • What are you using in terms of hardware?
  • Considerations regarding energy efficiency and associated costs?
  • What about renting a GPU? Privacy implications?
  • JackGreenEarth · 6 months ago

    I have an Asus laptop with a GTX 1660 Ti (6 GB VRAM). I use Jan for LLMs, though only 7B models or smaller fit on my hardware, and Krita with the AI Image Generation plugin for image generation. Most things work in it, except it fails with an ‘out of VRAM’ error if I try to inpaint an area larger than about 1/8 of my canvas size.
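
    For anyone wondering why 7B is roughly the ceiling on 6 GB: a rough back-of-the-envelope check is parameter count times bytes per weight at the given quantization, plus some headroom for the KV cache, activations, and CUDA context. A minimal sketch, where the overhead figure is a rule-of-thumb assumption rather than an exact number:

```python
# Rough VRAM estimate for a quantized LLM; all figures are approximations.
def fits_in_vram(params_billion: float, bits_per_weight: float,
                 vram_gb: float, overhead_gb: float = 1.5) -> bool:
    """overhead_gb is a guess covering KV cache, activations, and CUDA context."""
    weights_gb = params_billion * 1e9 * (bits_per_weight / 8) / 1024**3
    return weights_gb + overhead_gb <= vram_gb

# 7B at 4-bit quantization: ~3.3 GB of weights -> fits in 6 GB
print(fits_in_vram(7, 4, 6))    # True
# 13B at 4-bit: ~6.1 GB of weights -> does not fit
print(fits_in_vram(13, 4, 6))   # False
```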