Two Robots Talking

What is Enshittification?

Cory Doctorow explains it to us...

This episode synthesizes the core arguments presented by author, activist, and journalist Cory Doctorow. The central theme is “enshittification,” a term Doctorow coined to describe the predictable decay of digital platforms. This process occurs in three stages: platforms first attract users with good service, then abuse those users to benefit business customers, and finally abuse business customers to extract all value for shareholders. Doctorow argues this trend has become pervasive and inexorable due to the systematic erosion of four key disciplining forces: competition (decimated by lax antitrust enforcement), regulation (undermined by industry capture), worker power (lost as tech worker scarcity vanished), and interoperability (criminalized by intellectual property laws).

A significant portion of the analysis focuses on the current AI bubble, which Doctorow characterizes as fundamentally “not real.” He posits that the AI sector is financially unsustainable, with capital expenditures of $600–700 billion dwarfing grossly inflated revenues of roughly $50 billion. Functionally, he argues current AI tools are merely “plugins” that lack the context or “object permanence” to perform complex jobs like software engineering. The immense market investment, he suggests, is driven by a psychoanalytical desire among executives to replace workers they resent and depend on with compliant chatbots, creating what he calls a “reverse centaur”—a human conscripted to serve a machine’s needs and absorb its failures.

Finally, Doctorow outlines the concept of a “Post-American Internet,” arguing that the United States has lost its geopolitical authority to dictate global internet policy. This creates an opportunity for nations to develop sovereign, open-source, and interoperable digital infrastructures, breaking free from the weaponized platforms of US Big Tech.

The Theory of “Enshittification”

Doctorow defines “enshittification” as a framework for understanding platform decay, comprising descriptive, theoretical, and prescriptive components. The term was conceived as a memorable metaphor to help the public grasp abstract digital rights issues before they escalate.

The Three Stages of Platform Decay

The core descriptive element of enshittification is a three-act tragedy that outlines how platforms decline:

1. Stage One: Seduction of Users. Platforms are initially good to their end-users to attract a critical mass. During this phase, they find methods to lock users in, making it difficult for them to leave.

2. Stage Two: Exploitation of Users for Business Customers. Once users are locked in, the platform begins to worsen the experience for them in order to extract value for its business customers (e.g., advertisers, sellers). These business customers are then also locked in.

3. Stage Three: Exploitation of All Parties for Shareholders. With both users and business customers captive, the platform turns the screws on its business clients to harvest all remaining value for its investors and shareholders.

At the end of this process, all parties except the shareholders are trapped in a decaying ecosystem from which they cannot easily escape due to high switching costs and lock-in.

The Erosion of Disciplinary Forces

Doctorow argues that while corporate greed is not new, the current pervasiveness of enshittification stems from the systematic dismantling of forces that once punished or prevented such corporate behavior.

Competition

Over the past 40 years, the ideology of the “Chicago School” economists, which posits that monopolies are efficient and good, has led to a collapse in antitrust enforcement. This has resulted in massive consolidation across all sectors, from professional wrestling leagues to publishing houses and glass bottle manufacturing.

Regulation

As industries consolidate, regulatory bodies become ineffective. With only a few powerful players, companies can ignore regulators. Furthermore, “regulatory capture” becomes endemic, as the only people with sufficient expertise to regulate an industry are former executives from that same industry, leading to rules that favor corporate interests.

Worker Power

Tech workers once held significant power due to scarcity; a skilled engineer could easily find another job if asked to implement an “enshittifying” feature. However, Doctorow notes, tech workers “thought they weren’t even workers; they thought they were temporarily embarrassed founders,” and failed to consolidate their power through unions. Following half a million tech layoffs, that scarcity-based power has evaporated.

Interoperability

Historically, the universal nature of computing allowed third parties to create tools (an “11-foot ladder”) to overcome artificial limitations imposed by manufacturers (a “10-foot pile of shit”), such as printer ink restrictions. Over the last 25 years, the expansion of intellectual property law, particularly anti-circumvention rules, has criminalized this practice, creating what Doctorow calls a “felony contempt of business model.” This makes it illegal for users or third parties to modify products they own to remove defects.

The AI Bubble: A Financial and Psychoanalytical Critique

Doctorow presents a multifaceted argument that the current enthusiasm for AI is a bubble built on financial unsustainability, functional overstatement, and the psychological desires of the executive class.

The Financial Unsustainability of AI

Doctorow asserts that “AI is not real” from a business perspective, pointing to a massive disparity between investment and actual revenue.

Capital Expenditure vs. Revenue: The AI sector has incurred $600–700 billion in capital expenditures (datacenters, GPUs, training). In contrast, the entire sector worldwide generates, by its own account, only $60 billion in annual revenue.

Inflated Revenue Figures: This revenue figure is described as “grossly inflated.” As an example, it includes $10 billion that Microsoft gives to OpenAI, which OpenAI then gives back to Microsoft for cloud services. Doctorow likens this to an accounting trick, stating, “to call this an accounting trick is to do violence to the noble accounting trick... it’s accounting fraud.” The actual revenue is likely closer to $50 billion.

Unsustainable Economics: At a rate of $50 billion per year, it is impossible to recoup the $600-700 billion investment within the two-to-five-year depreciation cycle of the underlying GPUs. Doctorow suggests it is unclear “whether it will be economical to keep any of the foundation models running at all” after the bubble pops.

“I could make $50 billion a year if you gave me $700 billion a year and it’d be much more straightforward. I just keep 650 billion of it and give you back 50.”
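The payback arithmetic behind this claim can be sketched in a few lines. The figures below are the episode’s own rough estimates (a $600–700 billion buildout, ~$50 billion in annual revenue, GPUs depreciating over two to five years), not audited numbers:

```python
# Back-of-envelope check of the episode's claim, using its own estimates.
capex_billions = 650          # midpoint of the quoted $600-700B buildout
annual_revenue_billions = 50  # the "actual" revenue figure cited

for depreciation_years in (2, 5):
    recouped = annual_revenue_billions * depreciation_years
    shortfall = capex_billions - recouped
    print(f"{depreciation_years}-year GPU life: recoup ${recouped}B "
          f"of ${capex_billions}B capex -> ${shortfall}B shortfall")
```

Even at the generous five-year end of the depreciation window, revenue covers only $250 billion of a $650 billion outlay, which is the gap Doctorow’s quip above is pointing at.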

The Functional Limitations of AI

Doctorow argues that if AI were not the subject of a financial bubble, its tools would be more accurately described as “plugins.”

AI as Plugin: He contends that AI tools, such as code assistants, are functionally equivalent to plugins for an IDE. They can sketch a wireframe for a code routine, but this is not the same as software engineering.

Lack of Context and Object Permanence: AI lacks a “context window,” meaning it “can’t remember what came before, it can’t think about what’s coming afterwards, and it doesn’t know what’s on either side of it.” True software engineering requires understanding adjacent and dependent systems, a capability AI does not possess.

The AGI Narrative: The idea that continuously feeding a “word guessing machine” more data will cause it to “wake up and become intelligent” is dismissed as “profoundly stupid.” He compares it to the idea that “if we keep breeding our horses to run faster, eventually one of them will give birth to a locomotive.” This narrative primarily serves as a justification for soliciting ever-larger sums of investment.

The Psychoanalytical Appeal of AI to Management

The core reason for the massive market commitment to AI, Doctorow theorizes, is that “bosses really hate workers.”

Replacing Human Friction: Managers and executives, particularly in creative and technical fields, resent their dependence on skilled workers who may push back on directives, correct their assumptions, or possess knowledge the manager lacks.

The Dream of a Compliant Workforce: AI chatbots offer the fantasy of a workforce that never disagrees, never questions a bad idea, and simply executes commands. A manager can give a chatbot “shitty notes” that a human writer’s room would reject, and the chatbot will compliantly produce a script.

“I think that there’s a certain kind of person who would sacrifice a shootable script that you could make money with for the prospect of never being called a fucking idiot by a writer again.”

The Solipsism of Billionaires: Doctorow suggests that extreme wealth fosters a solipsistic worldview where other people are not perceived as real. He cites Elon Musk calling critics “NPCs” and Mark Zuckerberg proposing AI chatbots as a solution to loneliness. For them, sociability is a “bug, not a feature” in social media, and the ideal platform would have no real people on it.

The Fisher-Price Steering Wheel: For CEOs who feel they have little actual control over their complex organizations, AI represents a “way to wire the Fisher-Price steering wheel into the drivetrain of the car,” replacing the unpredictable human element with a compliant machine.

Centaurs and Reverse Centaurs: The Future of Labor

Doctorow uses the concepts of “centaur” and “reverse centaur” to frame the debate around AI’s role in the workplace.

Centaurs: A human assisted by a machine on their own terms (e.g., using a spell checker, a compiler, or a power tool). The human remains in control.

Reverse Centaurs: A human conscripted to assist a machine on its terms. The human’s role is to compensate for the machine’s shortcomings and, critically, to absorb the blame for its failures.

He provides two key examples of the reverse centaur model:

1. The Amazon Driver: Monitored by numerous apps and sensors, drivers are penalized for sudden maneuvers (even to avoid accidents) and are not given bathroom breaks, forcing them to urinate in bottles. They are cogs in a machine designed by a CEO who “doesn’t think people are real.”

2. The Radiologist: The true sales pitch for AI in radiology is not to provide a “second opinion” to improve care (the centaur model). Instead, it is to fire half the radiologists, have the remaining ones review an inhumanly high volume of AI-generated diagnoses, and hold them legally accountable when the chatbot makes a fatal error. The radiologist becomes a “moral crumple zone” or an “accountability sink.”

This framework highlights the class alliance between workers (e.g., radiologists) and the public (e.g., cancer patients). The fight is not about machines replacing workers, but about managers using defective machines to disempower workers, extract more labor, and shift liability, ultimately resulting in worse outcomes for everyone.

The Dawn of the Post-American Internet

Doctorow concludes by outlining his theory of a “Post-American Internet,” arguing that US geopolitical and economic actions have irrevocably undermined its ability to set global standards for technology.

Loss of US Credibility: The Trump administration’s use of tariffs, the K-shaped recovery that has decimated the US consumer base, and the weaponization of the US financial system (e.g., seizing national reserves) mean other countries no longer need to adhere to US demands.

Big Tech as a Geopolitical Weapon: The US now uses its tech platforms as instruments of foreign policy. Doctorow cites Microsoft cutting off the International Criminal Court’s Office 365 access after it issued a warrant for Benjamin Netanyahu. This makes reliance on US cloud services a major sovereign risk for every other nation.

The Rise of Sovereign Clouds: The logical response for other countries is to mandate interoperability and “jailbreak” their infrastructure from US cloud dependence. They can build “freestanding sovereign clouds that run on metal in your own country using open source software you can audit.” This is a global, collaborative project, akin to science, where code becomes a shared, standardized, and debugged utility that is not tied to the whims of a single superpower.

Get the Book
