I'm using Ollama on my server with the WebUI. It has no GPU, so replies aren't fast, but they aren't too slow either.
I'm thinking about removing the VM since I just don't use it. Are there any good uses or integrations with other apps that might convince me to keep it?
https://github.com/hendkai/paperless_sort_low_quality_ollama lets AI tag your Paperless-ngx files based on their content.
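If you want to try that idea without pulling in the whole repo, here's a rough sketch of the basic loop: grab a document's OCR text from the Paperless-ngx REST API and ask Ollama to suggest tags. The URLs, token, model name, and document id below are placeholders for my assumed setup, not anything from that project, so adjust them to yours.

```python
import requests

# Assumed endpoints/credentials -- replace with your own setup.
PAPERLESS_URL = "http://localhost:8000"
PAPERLESS_TOKEN = "your-paperless-api-token"
OLLAMA_URL = "http://localhost:11434"
MODEL = "llama3"  # any model you have pulled in Ollama

def get_document_text(doc_id: int) -> str:
    """Fetch a document's OCR'd text from the Paperless-ngx REST API."""
    resp = requests.get(
        f"{PAPERLESS_URL}/api/documents/{doc_id}/",
        headers={"Authorization": f"Token {PAPERLESS_TOKEN}"},
    )
    resp.raise_for_status()
    return resp.json()["content"]

def suggest_tags(text: str) -> list[str]:
    """Ask Ollama to propose a few short tags for the document text."""
    prompt = (
        "Suggest up to five short tags (comma separated, no explanations) "
        "for the following document:\n\n" + text[:4000]
    )
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": MODEL, "prompt": prompt, "stream": False},
    )
    resp.raise_for_status()
    return [t.strip() for t in resp.json()["response"].split(",") if t.strip()]

if __name__ == "__main__":
    doc_id = 42  # hypothetical document id
    print(suggest_tags(get_document_text(doc_id)))
```

On a CPU-only box this is slow per document, but it works fine as an overnight batch job, which is exactly the kind of background use a GPU-less VM is good for.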