Contributor: The web is awash in AI slop. Real content is for subscribers only, and democracy suffers

Princess Diana tripping through a parkour park. Team USA taking gold at the Olympic Games in Bonga. Tank Man breakdancing in Tiananmen Square. Kurt Cobain playing with Pogs. Tupac Shakur hunting for poutine at Costco. OpenAI's Sora 2 artificial intelligence video generator debuted this month, and internet jokesters have been pouncing on it. Lighthearted and harmless? Or a sign that we are kissing reality goodbye, coming of age at a moment when no one can ever trust a video again?

This is only the latest example of how AI is reshaping the world. But the problem goes deeper than energy-hungry models churning out brain-rotting novelty clips; slop poses a serious threat to democracy itself. Today, billions of people experience the internet not through high-quality news and information, but through algorithmically generated clickbait, disinformation and nonsense.

This phenomenon creates a "slop economy": a second-tier internet where those who don't pay for content are inundated with low-quality, ad-optimized sludge. Platforms such as TikTok, Facebook and YouTube are flooded with as much content as possible at minimal cost, produced by algorithmically scraping and remixing scraps of human-written material into synthetic slurry. Bots create and distribute countless fake clickbait opinion blogs, how-to guides, political memes and get-rich-quick videos.

Today, almost 75% of new web content is at least partially generated by artificial intelligence, but this flood is not distributed evenly across society. People who pay for high-quality news and information services enjoy trustworthy journalism and fact-checked reporting. But billions of users cannot afford paid content or simply prefer free platforms. In developing countries the gap is stark: for the billions of people coming online for the first time through cheap phones and patchy networks, slop often becomes synonymous with the internet itself.

This matters for democracy in two key respects. First, democracy depends on informed citizens who share a base of facts and can understand the issues that affect them. The slop economy misleads voters, undermines trust in institutions, and deepens polarization by amplifying sensationalist content. Beyond the widely discussed problem of foreign disinformation campaigns, this insidious epidemic of slop affects far more people every day.

Second, people can become susceptible to extremism simply through prolonged exposure to slop. As users scroll through divergent algorithmic feeds, we lose consensus on core truths, with each camp literally living in its own information universe. It is a growing problem in the United States: AI-generated news has become so plentiful, and so realistic, that consumers rate "pink slime" news sites as more factual than real news sources.

Demagogues know this and exploit the world's information-poor. AI-generated disinformation is already a widespread threat to election integrity in Africa and Asia, with deepfakes in South Africa, India, Kenya and Namibia influencing tens of millions of first-time voters on cheap phones and apps.

Why has slop taken over our digital world, and what can we do about it? To find answers, we surveyed 421 programmers and developers in Silicon Valley who build the algorithms and platforms that determine our information diets. We found a community of engaged technology insiders who are held back from making positive change by market forces and corporate leadership.

The developers told us that the ideology of their bosses largely determines what they build. More than 80% said their CEO or founder's personal beliefs influence product design.

And it's not just executives who put business success above ethics and social responsibility. More than half of the developers we surveyed regret the negative social impacts of their products, yet 74% would still build tools that restrict freedoms, such as surveillance platforms, even if doing so troubled them. Pushing back against the corporate culture of tech companies is hard.

This reveals a troubling synergy: business incentives align with a culture of compliance, producing algorithms that favor divisive or low-value content because it drives engagement. The slop economy exists because bad content is cheap and profitable to produce. Tackling the slop problem means changing those business incentives.

Firms could filter out the slop by downranking clickbait farms, clearly labeling AI-generated content, and removing obviously fake information. Search engines and social media platforms should not treat a human-written investigation and a bot-written pseudo-news article as equals. There are already calls in the U.S. and Europe to impose quality standards on the algorithms that decide what we see.

Unconventional solutions are also possible. One idea is to create public, nonprofit social networks. Just as you tune into public radio, you could tune into a public, AI-free social news feed that rivals the TikTok scroll but serves real news and educational snippets rather than conspiracies. And given that 22% of Gen Z say they hate artificial intelligence, the next billion-dollar private-sector idea may simply be a YouTube competitor that promises a complete ban on AI content, forever.

We can also starve slop producers by cutting off the flow of advertising money that rewards content farms and spam sites. If ad networks refuse to fund sites with zero editorial standards, the flood of junk content will slow. This has worked against extremist disinformation: when platforms and payment systems cut off the money, the volume of toxic content drops.

Our research offers a ray of hope. Most developers say they want to create products that strengthen democracy rather than undermine it. Reversing the slop economy will require technology creators, consumers and regulators working together to build a healthier digital public sphere. Lasting democracy, from local communities to the global stage, depends on closing the gap between those who get facts and those who are fed slop. Let's clean up the digital sludge before it eats away democracy as we know it.

Jason Miklian is a research professor at the University of Oslo in Norway. Christian Hölscher is a research professor at the Peace Research Institute in Oslo, Norway.
