Increasingly often when I click on search results I end up on a page of AI generated nonsense. A great example today: I searched for “second baseman blocking base” to check the current MLB rules, and the 4th result on DuckDuckGo was this page full of gems like this:

In modern baseball, the second baseman is usually the only infielder who does not wear a glove

and this:

If a runner is forced out of a base because he has no legal path to it, the batter-runner is not penalized; instead, the law regarding trespassers on private property applies. For example, if a home plate is surrounded by a fence and the batter attempts to go past it into another yard, he has committed a crime in most states. The police would be called to report this incident, not because of anything the player did, but because he had no right to be on the owner’s property.

This finally pushed me over the edge to try out Kagi, a paid search engine with a mission of no ads or spam. Unfortunately, their third result for that same search was this page, which led off with this:

When blocking a player, it’s important not to hold onto the ball for too long. This can lead to obstruction calls and an award of base runners to the opposing team. Instead, use your body as a shield and try to keep the ball out of reach by using your arm or leg. If you do get called for obstruction, be sure to immediately apologize and offer an apology gift such as flowers or chocolates to the offended player

That’s also the third result on Google, so they’re not immune from this problem either.

My concern is that we’re entering a death spiral of human knowledge on the web. AI content generators can only remix knowledge from human-generated content. Most of the human-generated content on the web is funded by ad revenue, which requires traffic to be directed to those sites. But between AI-generated content drowning out real content in search results, and the movement of search engines away from linking to content and toward immediately returning AI-generated answers, there is going to be less and less traffic directed to sites written by actual humans. As that traffic and the revenue it brings dry up, those sites will no longer have the funding or motivation to create and publish content. Then we’ll reach a point where the AI content generators are just remixing other AI-generated content: at best, no new real knowledge is being added to the mix, and at worst, it will devolve into worse and worse nonsense.

It is critical, both to the continuity of the web as a network of human knowledge and to Google and other search engines as viable businesses, that we avoid this death spiral. Search engines need to start aggressively filtering out AI-generated content and promoting human-generated content, and also return to sending traffic to the sites creating that content rather than trying to provide inline answers to everything. Otherwise the search engines will destroy the network of human knowledge that made them valuable in the first place, and we’ll all be worse off for it.