Since the release of automated “generative-AI” services—like ChatGPT, Gemini, DALL-E, Stable Diffusion, and Midjourney—the information superhighway has been flooded with a deluge of machine-produced debris. Photos feature humans and animals with impossible anatomy, disjointed words that merely resemble text, and illogical architecture. Stochastic parrots peck at whatever word or pixel is most likely to come next.

[…]

Google searches, such as they now are, turn up chaotic and irrational answers, pleasantly presented as if for our edification. Famously, a search for “African country starting with K” brought this unintelligible response (drawing from Google’s ingestion of an AI-summarizing website, Emergent Mind, itself relying on user posts at Hacker News): “While there are 54 recognized countries in Africa, none of them begin with the letter ‘K.’ The closest is Kenya, which starts with a ‘K’ sound, but is actually spelled with a ‘K’ sound.”

[…]

Upon ingesting satirical clips from The Onion, McSweeney’s, and Reddit, Google search exhorted people to thicken tomato sauce with glue and eat their daily serving of rocks. The company that made its fortune helping us wade our way through the internet is now drowning in its own generated nonsense.

[…]

Large Language Models (LLMs) have invaded scientific papers, too. Succumbing to the pressure to “publish or perish,” authors and editors are doing us the favor of showing just how much of their papers are generated by ChatGPT, sometimes simply by neglecting to proofread before publication. A few even make up scientific images and figures. Meanwhile, AI-generated books are flooding Amazon. If you happen to buy a machine-written book on mushroom hunting, you’ll know only that its words were statistically likely to follow one another, not whether what you’re munching will actually kill you.

[…]

Instead of being organized around information, then, the contemporary internet is organized around content: exchangeable packets, unweighted by the truthfulness of their substance. Unlike knowledge, all content is flat. None of it is more or less justified, none more likely to yield true belief. None of it, at its core, is information.

As a result, our lives are consumed with the consumption of content, but we no longer know the truth when we see it. And when we don’t know how to weigh different truths, or to coordinate among different real-world experiences to look behind the veil, there is either cacophony or a single victor: the loudest voice wins.

[…]

In the midst of global wars and propaganda campaigns—when it is more important than ever to be informed—the systems that bring us our “information” can’t measure or optimize what is true. They only care what we click on.

The nail in the coffin is what is currently sold to us as “Artificial Intelligence.” […] both [artificial and intelligence] are reduced to equally weighted packets of content, merely seeking an optimization function in a free marketplace of ideas. And both are equally ingested into a great statistical machinery […]

[…]

The result is a great torrent of tales “told by an idiot, full of sound and fury, / Signifying nothing.” The role of humans in this turbulence is not to attend with wisdom, but merely to contribute, and to consume.