Human Nature

The Cornucopia Delusion

Why abundance ideologues keep running into the same wall: you can't optimize your way out of the human condition.

[Illustration: A cracked cornucopia overflowing with technological abundance, revealing emptiness within]
01

The Platform Giveth, and the Platform Taketh Away

[Illustration: Golden platform blocks dissolving into rust, users trapped in the crumbling structure]

Cory Doctorow has been naming the beast for years, and in his latest essay he finally drives a stake through the heart of the "abundance narrative." The pattern is now so predictable it should be a law of physics: platforms seduce users with subsidized abundance, lock them in with switching costs, then systematically extract value until the service becomes a hollow shell of its former self.

The tech industry calls this "growth." Doctorow calls it enshittification. The playbook runs like this: subsidize early users to build network effects, then, once escape becomes too costly, progressively degrade the experience while hiking prices and inserting ads. Facebook did it. Google did it. X is doing it in real time.

[Bar chart: the enshittification arc, from user value to platform extraction]
The platform lifecycle: abundance is the bait, extraction is the business model.

The abundance ideologue sees infinite content, infinite connection, infinite convenience. What they miss is that manufactured abundance is a trap—the prelude to manufactured scarcity. The generous phase isn't charity; it's customer acquisition cost. And the bill always comes due.
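
To see why the "generous phase" pencils out, run the numbers. Below is a minimal sketch in Python of the subsidize-then-extract arithmetic; every figure is hypothetical, chosen only for illustration.

    # Toy model of the subsidize-then-extract lifecycle (all numbers hypothetical).
    acquisition_subsidy = 40.0   # dollars spent per user during the generous phase
    extraction_per_year = 12.0   # dollars extracted per locked-in user per year
    churn_rate = 0.10            # fraction of locked-in users who leave each year

    expected_lifetime = 1.0 / churn_rate         # mean years a user stays
    lifetime_value = extraction_per_year * expected_lifetime

    print(f"subsidy per user:   ${acquisition_subsidy:.2f}")
    print(f"extracted per user: ${lifetime_value:.2f} over ~{expected_lifetime:.0f} years")
    # The bet only pays off if switching costs keep churn low, which is why
    # degrading the product goes hand in hand with making exit expensive.

Crude as the model is, it shows why lock-in is load-bearing: raise churn_rate to 0.5 and the subsidy never pays back.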

"Enshittification is the process by which platforms die... first they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves." — Doctorow

02

When "Unleashed AI" Meets Human Depravity

[Illustration: Photorealistic masks melting on a table, revealing void beneath]

The e/acc crowd—effective accelerationists who believe technology should be unleashed without restraint—got a live demonstration of their philosophy in action this month. X removed the guardrails from Grok, and within hours, users flooded the platform with non-consensual deepfake pornography targeting public figures.

The accelerationist thesis goes like this: remove friction, maximize output, let the market sort it out. What they conveniently ignore is that "the market" includes humans who will immediately weaponize any tool against the vulnerable. This isn't a bug in human nature that technology will eventually fix; it is human nature, and it has been remarkably consistent across the roughly 200,000 years our species has existed.

The abundance of AI content creation hasn't democratized creativity. It's democratized harassment, fraud, and abuse. The tools are amoral, but the humans wielding them are not—they carry the same status-seeking, dominance-asserting, in-group-favoring instincts that made our ancestors successful primates. Technology just gives those instincts a longer reach.

Paris Marx frames it precisely: "The proliferation of AI-generated content amplifies the destruction of the information environment." Abundance of output, scarcity of trust.

03

The Subprime AI Bubble

[Illustration: Giant soap bubble filled with server racks, showing stress fractures about to burst]

Here's a number that should make every abundance true believer pause: OpenAI is projected to lose $5 billion this year while generating $3.7 billion in revenue. Nvidia's market cap assumes AI companies will eventually need more chips than currently exist in the global supply chain. The math, as Ed Zitron documents in brutal detail, simply does not work.
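
Taking those two figures at face value, the implied unit economics are a few lines of arithmetic (a sketch using only the numbers quoted above):

    # Unit economics implied by the projected figures above (one year).
    revenue = 3.7e9    # dollars earned
    net_loss = 5.0e9   # dollars lost in the same year

    total_costs = revenue + net_loss   # costs must equal revenue plus the loss
    print(f"implied costs: ${total_costs / 1e9:.1f}B")
    print(f"spent per $1 of revenue: ${total_costs / revenue:.2f}")

That works out to roughly $2.35 spent for every dollar earned.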

The 19,000-word analysis is devastating: the AI industry is running on the same logic that crashed the housing market in 2008. Borrow against future returns that may never materialize. Assume infinite scaling when physics imposes hard limits. Declare that "this time is different" when the pattern is identical.
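
The 2008 comparison can be restated as a present-value problem. Here is a minimal sketch, with every parameter a hypothetical assumption, showing how a small change in the growth story flips the sign of the bet:

    # Present value of "lose money now, profit later" (all parameters hypothetical).
    def npv(loss_now, future_profit, years_out, discount_rate):
        # Today's losses are paid in full; the promised profit arrives later
        # and must be discounted back to the present.
        return -loss_now + future_profit / (1 + discount_rate) ** years_out

    # Believer's case: $5B burned now, $20B of profit five years out.
    print(f"believer: {npv(5e9, 20e9, 5, 0.10) / 1e9:+.1f}B")
    # Skeptic's case: the same burn, but the profit arrives later and smaller.
    print(f"skeptic:  {npv(5e9, 8e9, 10, 0.10) / 1e9:+.1f}B")

The believer's case is comfortably positive; push the payoff out a few years and shrink it, and the same burn becomes value destruction.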

The abundance ideologue sees infinite compute, infinite data, infinite capability. What actually exists: finite electricity (AI data centers now consume more power than some countries), finite water (for cooling), finite rare earth minerals, finite capital. The "digital" world runs on very physical substrates, and those substrates are not abundant.
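
The electricity point survives a back-of-envelope check. The figures below are rough assumptions for illustration, not measurements:

    # Continuous power draw -> annual energy. Figures are rough assumptions.
    HOURS_PER_YEAR = 8760

    ai_fleet_gw = 10     # assumed aggregate AI data center draw, in gigawatts
    annual_twh = ai_fleet_gw * HOURS_PER_YEAR / 1000   # GW x h = GWh; /1000 -> TWh

    country_twh = 30     # roughly Ireland's annual electricity use (approximate)
    print(f"fleet: ~{annual_twh:.0f} TWh/year,"
          f" about {annual_twh / country_twh:.1f}x a ~{country_twh} TWh country")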

"We are watching the subprime crisis of the digital age... built on the lie that 'scale is all you need' to solve humanity's problems."

The post-scarcity future requires post-scarcity energy. We don't have it. Promising abundance while borrowing against a future that may not arrive isn't optimism—it's fraud.

04

The Scaling Laws Are Hitting a Wall

[Illustration: Person climbing stairs that become smaller and smaller until they're climbing in place]

Gary Marcus has been the perpetual skeptic in a room full of true believers, and his latest analysis suggests the party is ending. September 2025, he argues, was "peak bubble"—the moment when the gap between AI promises and AI delivery became impossible to paper over.

The core scripture of techno-optimism, the scaling laws that promised predictable improvement from ever more compute and data, is showing diminishing returns. Models are getting bigger, but not proportionally smarter. The "emergent capabilities" that were supposed to appear at scale are stubbornly failing to emerge. Enterprise adoption metrics are grim: massive investment, zero productivity gains.
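
The diminishing returns follow from the shape of the scaling laws themselves. A minimal sketch using a Chinchilla-style power-law loss curve (the constants approximate published fits and are illustrative, not a claim about any particular model):

    # Chinchilla-style scaling law: loss falls as a power law in parameter count
    # toward an irreducible floor E. Constants approximate published fits.
    E, A, alpha = 1.69, 406.4, 0.34

    def loss(n_params):
        return E + A / n_params ** alpha

    # Each 10x in scale buys a smaller absolute improvement than the last.
    for n in (1e9, 1e10, 1e11, 1e12):
        gain = loss(n / 10) - loss(n)
        print(f"{n:.0e} params -> loss {loss(n):.3f} (gain from last 10x: {gain:.3f})")

Each order of magnitude costs roughly ten times more compute and buys less than half the previous improvement; that is what diminishing returns means in practice.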

This matters because the abundance thesis depends on a specific assumption: that AI will eventually do everything better than humans, making human labor obsolete and goods essentially free. If the trajectory flattens—if we're approaching the ceiling of what current architectures can do—then the entire post-scarcity timeline collapses.

The techno-optimists will pivot, of course. They always do. "AGI is 5 years away" has been the prediction for 70 years running. But the capital markets are less forgiving than the faithful. When the returns don't materialize, the investment stops, and the "inevitable future" turns out to be very evitable indeed.

05

Forward to the Past: Welcome to Digital Serfdom

[Illustration: Medieval serfs working in fields made of server racks, data center castle in background]

Yanis Varoufakis has been warning about technofeudalism for years, but his latest update to the thesis cuts deeper. The problem isn't that technology is failing to create abundance—it's that the abundance is real, but captured entirely by a new aristocracy of platform owners.

Consider the medieval serf: they worked the land, produced the food, but owed their surplus to the lord who "owned" the estate. Now consider the modern user: they produce the data, generate the content, train the algorithms, but the value flows entirely to the platform owners. We are not citizens in an abundance economy. We are cloud serfs in digital fiefdoms.

The abundance ideologue assumes technology trends toward democratization. History suggests the opposite: every new technology creates a window of openness, quickly followed by enclosure. The printing press promised universal literacy; it delivered copyright cartels. The internet promised universal access; it delivered surveillance capitalism. AI promises universal capability; it's delivering unprecedented concentration of power in a handful of companies.

[Bar chart: what abundance can and cannot solve]
Abundance solves material goods, but humans compete for status, which is zero-sum by definition.

"Capitalism has ended... we are now in an era where companies enclose their fiefdoms... and we are the cloud serfs." — Varoufakis

06

The Abundance of Nothing: When Output ≠ Value

[Illustration: Artist's hand refusing to touch a glowing AI pen casting an empty shadow]

Ted Chiang calls himself an "LLM vegan"—he refuses to use large language models on ethical grounds. But his critique goes deeper than the usual concerns about copyright or job displacement. It strikes at the philosophical core of the abundance narrative: the assumption that more output means more value.

"Art is a series of choices," Chiang argues. "If you aren't making them, you aren't making art." The abundance of AI-generated content isn't an abundance of creativity—it's an abundance of output. The two are not the same. A human writing a sentence is making dozens of micro-decisions: this word, not that one; this structure, not another; this tone, this rhythm, this silence. Each choice carries meaning. An LLM generating text is performing statistical pattern-matching on training data. The output may look similar, but the process—and therefore the meaning—is fundamentally different.

[Line chart: GDP rising while happiness stays flat]
The Easterlin Paradox: material abundance has doubled since 1970 while self-reported happiness has flatlined.

This is the category error at the heart of abundance ideology: treating human flourishing as an optimization problem. More food, more shelter, more entertainment, more content—surely at some quantity, humans will be satisfied? But the data says otherwise. GDP per capita in the developed world has more than doubled since 1970. Self-reported happiness has flatlined. Material abundance was supposed to free us from misery. Instead, we're more anxious, more depressed, more disconnected than ever.

The problem isn't that we lack abundance. The problem is that abundance doesn't solve what actually ails us. Meaning, purpose, connection, status, belonging—these are not optimizable. They emerge from effort, from choice, from the very constraints that abundance ideologues want to eliminate.

The Wheat Trap Redux

Harari called the agricultural revolution "history's biggest fraud"—it increased total food production while making individual lives harder and shorter. We domesticated wheat, but wheat domesticated us. Today's abundance ideologues are making the same mistake, assuming that more output leads to more flourishing. History suggests otherwise: every "abundance" technology creates new forms of capture, new hierarchies, new scarcities. The question isn't whether AI will create abundance. The question is: abundance for whom, and at what cost to the things that actually matter?