No, you shouldn’t. Google has enough data already. If it’s not self-hosted, it can’t be trusted.
The idea that you should rely exclusively on self-hosted approaches is just as absurd as the idea that you should blindly trust everyone.
Plus, if they have, as you say, “enough” data already, then surely giving them more doesn’t actually hurt you in any way, shape or form?
yeah, self-hosting may be a bit much for everyone, but they should at least open up the training dataset, since an AI is biased by whatever data it’s trained on
e.g. how some smart taps won’t work for Black people because the company just didn’t train the sensors on dark skin
imo, Nextcloud took the best approach here: letting users tap into GPT-4 if they want, while still offering a totally in-house FLOSS option
why not?
what is so absurd about code running on a user’s own device?
Because it’s just unnecessary. Due to their nature, you want a few services reachable from anywhere anyway. There’s no reason for the average consumer to acquire hardware for this purpose. Just rent the service or the hardware elsewhere, which also reduces upfront cost; that’s ideal when you can’t know whether you’ll stick with the service.
Again, either extreme is absurd. You don’t need your own video streaming platform, for example. In rare cases, sure, but for the vast majority of people, Netflix is a much better service.
hard disagree on that one; the opposite is true. we end up with companies centralizing everything in huge datacenters and still struggling to profit from it (services like YouTube have long been reported as barely profitable). the best solution would be a federated service. I digress, though, because video platforms are a completely different beast.
something as personal as an AI assistant should use the processing power I already have available; it’s wasteful not to.
also, it’s a BAD idea to hand data for something so personal to Google yet again. let’s not keep repeating that mistake if we can avoid it.
I would love to self-host something like that, but I don’t have a good enough GPU for it.
Newer Pixels have dedicated AI hardware in them, which could be able to run these locally. Apple is reportedly planning local LLMs too. There’s been a lot of development on “small LLMs,” which have a ton of benefits: they’re easier to study, they run on lower-spec hardware, and they use less power.
Smaller LLMs come with huge performance tradeoffs, most notably in their ability to follow prompts. Bard has billions of parameters, so mobile chips wouldn’t be able to run it.
That’s right now; small LLMs have only very recently become a focus of development. And judging by how fast LLMs have been improving, I can see that changing very soon.
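The feasibility dispute above comes down to arithmetic: the memory needed for a model’s weights scales with parameter count times bits per parameter. A rough sketch (the model sizes and RAM figures are illustrative assumptions, not benchmarks):

```python
# Back-of-envelope estimate of the RAM needed just to hold an LLM's weights.
# Parameter counts and phone RAM figures below are illustrative assumptions.

def model_memory_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate gigabytes required to store the weights alone."""
    total_bytes = params_billions * 1e9 * bits_per_param / 8
    return total_bytes / 1e9

# A hypothetical 7B-parameter model at 16-bit precision: far beyond a phone.
print(model_memory_gb(7, 16))  # 14.0

# The same model quantized to 4 bits: plausible on a flagship phone with
# 8-12 GB of RAM, which is why small quantized models are the focus of
# on-device work.
print(model_memory_gb(7, 4))   # 3.5
```

This ignores activation memory and runtime overhead, so real requirements are higher; the point is only the order of magnitude.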
deleted by creator
Ridiculous take.
There’s a vast difference between using a cloud service that definitely spies on you, and a self-hosted solution that you can ensure doesn’t.
The ridiculous take is the joke:
deleted by creator
In this case it’s less about “spying” and more about data being used for training.
and advertising.
and it’s also about the way they pretend that, because they’re processing data on device, it’s somehow safe from them. No, they’re processing data on device to do federated learning (or to otherwise use the processed data in ways you’d still prefer they just didn’t).
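The federated-learning point can be made concrete with a toy sketch: each device computes an update from its local data, and only the updates, not the raw data, are shipped off to be averaged. This is a minimal illustration of FedAvg-style averaging with made-up numbers, not any vendor’s actual pipeline:

```python
# Toy illustration of federated averaging (FedAvg): devices train locally,
# then send weight updates to a server that averages them. The numbers are
# invented; the point is that data-derived updates still leave each device.

def fed_avg(client_updates: list[list[float]]) -> list[float]:
    """Element-wise mean of the weight updates from all clients."""
    n = len(client_updates)
    return [sum(column) / n for column in zip(*client_updates)]

# Three devices each produce a two-weight update from their local data.
updates = [
    [0.10, 0.20],
    [0.30, 0.40],
    [0.20, 0.60],
]

global_update = fed_avg(updates)
print([round(w, 2) for w in global_update])  # [0.2, 0.4]
```

So "processed on device" describes where the computation runs, not whether information derived from your data reaches the server.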
Being self-hosted in no way, shape, or form ensures that it doesn’t spy on you. You’re still putting trust in a third party to keep its promises. The average user lacks the know-how to audit code. Hell, the average user wouldn’t be able to figure out self-hosting in the first place.
You don’t have to audit code to ensure it doesn’t call home.
Okay, what can the average user do to ensure this, then?
disable your internet connection.
that’s really it. Lots of apps find lots of ways to call home, and Google especially is constantly calling home from Android, so unless you’re going to, like… uninstall all but one Google app to test it in a vacuum, and then add the other apps back one at a time, it’s not going to work. And even that experiment won’t work, because we already know that Google Play Services handles most of these shenanigans.
We’re talking about a service that intrinsically requires an internet connection, though.
yes, that’s my point, serdan is being silly, you’re right.
It’s actually quite easy to see if an app is phoning home. Also easy to prevent.
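If “easy to prevent” here means hosts/DNS-level blocking (the mechanism behind tools like Pi-hole or AdAway), the core idea fits in a few lines. The domains below are hypothetical placeholders, not a real telemetry list:

```python
# Minimal sketch of hosts-file style blocking: any domain the list points
# at a sinkhole address simply stops resolving to the real server.
# The entries are hypothetical examples, not actual telemetry endpoints.

SAMPLE_HOSTS = """\
# comment lines and blanks are ignored
0.0.0.0 telemetry.example.com
127.0.0.1 metrics.example.net
"""

def load_blocked(hosts_text: str) -> set[str]:
    """Collect domains that a hosts file redirects to a sinkhole address."""
    blocked = set()
    for line in hosts_text.splitlines():
        parts = line.split()
        if len(parts) == 2 and parts[0] in ("0.0.0.0", "127.0.0.1"):
            blocked.add(parts[1])
    return blocked

blocked = load_blocked(SAMPLE_HOSTS)
print("telemetry.example.com" in blocked)  # True
print("example.org" in blocked)            # False
```

The obvious limit is that blocking is all-or-nothing per domain, and apps can hard-code IPs or share infrastructure with services you actually want, which is exactly where this approach breaks down.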
Lol, how do you prevent a Google app from phoning home without preventing all Google apps, including Google Play Services, from accessing the internet at all?
Did you write the driver for the keyboard you wrote that on? Silly and completely unrealistic take. The world relies on trust to operate. It’s not black and white.
deleted by creator