No one is. Picture the following:
You have a friend you talk to online. You've never met in real life, but you talk every day. One day that connection is severed. You never know it, but a bot trained on all the data from your conversations is now talking to you instead, and the inverse has happened to your friend. You're now both talking to bots of each other, designed to seem just like the real thing while slowly influencing you toward whatever mode of thinking is desirable to the owner of the platform.

They don't seem to be coping so well with a non-AI-saturated 'post-truth world'.
The white media is already a flood of disinfo; they just hate the competition.
“In the same way you might use Google Maps to get everywhere and not know how to get there otherwise, AI might cause people to stop learning things they would have otherwise had to learn. Ironically, though, Rosen thinks this could cause more stress as people are inundated with AI and constantly shifting gears and not seeing anything quite clearly.”
This is one of the most concerning aspects of AI, IMO. Learning and thinking are some of the most fundamental parts of being human. When you can outsource thinking to a machine, how is that going to affect your sense of self-worth? How are we going to keep kids motivated to learn in school when they know they'll never be able to learn things as well as AI can?
I don’t understand why they’re trying to feed AI everything. There’s too much garbage on the internet and elsewhere. If they can’t make a good LLM with academic articles, Wikipedia, nonfiction books, newspapers, and magazines, then they need to go back to the drawing board.
“In the same way you might use Google Maps to get everywhere and not know how to get there otherwise, AI might cause people to stop learning things they would have otherwise had to learn.”
The argument makes a lot of sense to me, but this particular example falls somewhat flat. When I started my new job, I used Google Maps to learn the optimal route for a couple of days until I had it committed to memory. Then I didn’t need it anymore.
For a place I’m going to exactly once and probably never again, however, why would I want that information taking up valuable brain space? In a pre-Google Maps world, you’d probably spend twice as long taking a less optimal route in the general direction, then drive around the area for a while longer until you hopefully stumble upon whatever it is you’re looking for.
This has nothing at all to do with AI; we’re already living in a world filled with misinformation. AI doesn’t fundamentally change anything. The reality is that people come up with narratives they want to believe in, and then seek out information that fits those narratives.
We are already living in a post-truth world; just check the news.