I want to use SearxNG as my daily driver, so I added three instances to my browser’s list of search engines. However, all three seem to be down whenever I try to search, and I inevitably end up opening the list of public instances and clicking the top one just to perform a single search. Is there a way to “auto-route” my search through the most reliable instance, or something like that?
My solution to this was to run my own private instance. It’s very easy with docker-compose.
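If it helps, here’s roughly what mine looks like. This is just a minimal sketch built around the searxng/searxng image; the official searxng-docker setup also adds a Caddy reverse proxy and a Redis/Valkey cache, which I’ve left out here, and the port mapping and volume path are simply what I happen to use.

```yaml
# docker-compose.yml - minimal private SearXNG instance (rough sketch)
services:
  searxng:
    image: searxng/searxng:latest
    container_name: searxng
    ports:
      - "127.0.0.1:8080:8080"     # web UI reachable only from this machine
    volumes:
      - ./searxng:/etc/searxng    # settings.yml lives here
    environment:
      - SEARXNG_BASE_URL=http://localhost:8080/
      - SEARXNG_SECRET=replace-with-a-long-random-string  # or set secret_key in settings.yml
    restart: unless-stopped
```

Then `docker compose up -d` and add http://localhost:8080/search?q=%s as a custom search engine in your browser.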
Doesn’t this defeat the privacy benefit of searx, since everything comes from a single IP and a single user?
Does that help with the engine connection errors?
I self-host my own SearXNG instance and haven’t had any errors like that in the ~6 months it’s been running. It’s just anecdotal evidence, but I would imagine any such low-traffic instance would be able to avoid the blocking and other issues that high-traffic instances run into.
I’ve never thought of it like that. Another service for the homelab to run.
I still have some issues with Bing from time to time, but as far as I can tell that’s because they are changing stuff and the SearXNG devs just need to push a fix.
I believe Bing is making changes to accommodate ChatGPT.
Can that be done on a mobile device?
You could in theory, but hosting it on dedicated hardware will be a much better experience.
Searx.space has a whole bunch of instances. 92 online right now.
This is why I don’t use SearXNG as my daily driver! I’m too lazy to self-host and the public instances are always down.
I use search.sapti.me as my instance and I’ve never had an issue with it.
Mobile or desktop? LibRedirect on desktop lets you enable a ton of pre-populated Searx instances in the Search category, so it will use any of the enabled instances in your list.
I don’t believe it picks based on reliability but it at least saves you the time of manually picking/editing your instances. If one is troublesome, just remove it from the list.
The problem is that these alternative front ends for Google are just scraping the regular www.google.com results page. They aren’t using the Google Search API (which is paid and requires an API key), and Google says scraping of their search results is not allowed. They actively enforce this by blocking IPs that make too many search requests in a certain amount of time, which is the main reason a lot of Searx or SearxNG instances stop working after a while.
I feel this too. I tried using Searx for a while and found it really nice, but it was frequently down or broken :/ I’ll try some of the instances suggested here.