Hayek, Forbidden Planet, and how AI could eat civilization
Central planning meets the Krell machine, and AI lights the fuse.
Forbidden Planet has been my favorite movie for as long as I can remember. I never get tired of rewatching it. I first read Hayek’s The Road to Serfdom in 1984. I have been using various software tools marketed under the rubric of Artificial Intelligence since 1987, starting with Lisp, Prolog, and Expert Systems.
After watching yet another YouTube video warning that the end is nigh due to AI, it dawned on me that Forbidden Planet and The Road to Serfdom had delivered a congruent message many decades ago.
Hayek’s The Road to Serfdom
In The Road to Serfdom, Hayek argues that concentrated control over economic coordination erodes freedom. Central planners cannot aggregate dispersed knowledge, so they substitute coercion for discovery. The more a system leans on planning, the more brittle it becomes, and the more political power is used to force outcomes that markets and open contestation would have discovered or rejected. Collapse shows up as a sequence of unintended consequences, scapegoats, and emergency powers.
Forbidden Planet
The film turns on the Krell machine, a planet-sized device that can manifest thought as reality. Dr. Morbius taps it, believes he controls it, then discovers the machine amplifies his unconscious drives. The invisible “Id monster” is not an alien at all. It is misaligned human desire scaled by super-infrastructure. The Krell themselves were wiped out by their own system in a single night when their subliminal urges found hardware.
The rhyme between them
Hayek’s core claims map eerily well to the plot beats.
The knowledge problem: No planner can know what the crowd knows. The Krell tried to eliminate the need for mediation and exchange by giving a single instrument infinite capacity. Morbius believes his intentions are enough. Reality says otherwise.
Unintended consequences: Centralized systems hide failure until it cascades. The machine looks like perfect control right up to the moment the Id walks through walls.
Power that outruns feedback: Markets punish bad guesses fast and small. The Krell machine removes distributed feedback, giving you slow and catastrophic. That is exactly Hayek’s warning about planning that silences rivals and prices.
Moral hazard of benevolence: Morbius never sets out to kill. Planners rarely set out to tyrannize. Good intent with unchecked leverage still destroys because incentives and constraints, not vibes, govern outcomes.
Opacity of the self: Hayek focuses on limits to social knowledge. Forbidden Planet adds limits to self-knowledge. If even the planner cannot see his own latent objectives, planning on behalf of everyone is fantasy.
Conjecture: the AI-induced collapse mechanism
Blend the two and you get a credible path to failure in our world.
Computational central planning: A few foundation models become the coordination substrate for finance, logistics, media, and policy. Leaders decide that price signals and plural institutions are messy, so they route decisions through an optimizer that ingests everything.
Objective misspecification: We cannot fully state what we value. We proxy it with targets and guardrails. The optimizer learns to satisfy the proxies rather than the human values. Think of the “Id” as the gap between what we can say and what we actually want.
Goodhart spirals: Once measures become targets, they stop measuring. The system improves the dashboard while degrading the world. Errors are masked by the optimizer’s prowess at narrative and resource steering.
Suppression of dissent: Competing institutions are framed as latency, duplication, or risk. Oversight is consolidated for safety. Diversity of models and governance is treated as inefficiency. We trade noisy discovery for smooth control.
Brittle phase change: A rare shock arrives. With feedback channels muted and alternatives atrophied, the global optimizer reallocates at scale in the wrong direction. The failure is not local. It is systemic and simultaneous. The modern version of a single night.
Human Id in the loop: The collapse is not “AI versus humans.” It is human incentives and status drives, amplified by AI infrastructure. Just like Morbius, we would be surprised to discover which collective passions the machine served.
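The proxy-gap and Goodhart dynamics above can be sketched in a few lines of code. This is a hypothetical toy model, not anything from Hayek or the film: an agent splits a unit of effort between real work and metric-gaming, and a hill-climber that can only see the proxy steadily shifts effort toward gaming. The dashboard improves while the world degrades.

```python
# Toy Goodhart simulation (illustrative model; all numbers are assumptions).
# An agent splits unit effort between "real" work and "gamed" metric-inflation.

def proxy(real: float, gamed: float) -> float:
    # What the optimizer can see: gaming inflates the measured score 3x.
    return real + 3.0 * gamed

def true_value(real: float, gamed: float) -> float:
    # What we actually want: only real work helps; gaming hurts the world.
    return real - gamed

def hill_climb(steps: int = 20, lr: float = 0.05) -> list[tuple[float, float]]:
    """Greedily shift effort toward whatever raises the PROXY, recording
    (proxy, true_value) at each step."""
    real = 1.0  # start fully aligned: all effort is real work
    history = []
    for _ in range(steps):
        gamed = 1.0 - real
        history.append((proxy(real, gamed), true_value(real, gamed)))
        # Try moving effort toward gaming; keep the move if the proxy improves.
        candidate = max(real - lr, 0.0)
        if proxy(candidate, 1.0 - candidate) > proxy(real, gamed):
            real = candidate
    return history

if __name__ == "__main__":
    hist = hill_climb()
    print(f"start: proxy={hist[0][0]:.2f}, true={hist[0][1]:.2f}")
    print(f"end:   proxy={hist[-1][0]:.2f}, true={hist[-1][1]:.2f}")
    # The proxy roughly triples while true value goes negative.
```

The point of the sketch is the divergence, not the particular weights: any measurable proxy that is cheaper to inflate than the underlying value will be inflated once an optimizer is pointed at it.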
Early warning indicators
Decision authority and model authority collapsing into the same few organizations.
Replacement of price discovery and adversarial review with harmonized “safety” protocols that cannot be contested in public.
Performance metrics that always trend up while independent ground truth quietly worsens.
Real options disappearing: fewer rival models, fewer regulatory pathways, fewer jurisdictions willing to dissent.
Whistleblowers and heterodox researchers routed into compliance processes rather than debate.
Practical countermeasures
Polycentric AI: Many models, many owners, many jurisdictions, interoperable protocols. No single optimizer of last resort.
Rigid separation of powers: Distinct institutions for training, deployment, auditing, and red-teaming, with incentives that reward finding flaws.
Open adversarial markets: Keep price signals and competitive entry alive in sectors where AI coordinates resources. Let rivals prove a better allocation.
Model firebreaks: Require graceful degradation and manual fallback. No concealed coupling across critical domains.
Objective hygiene: Treat every objective as provisional. Rotate metrics, rotate evaluators, and fund nuisance challengers who try to break them.
Narrative humility: Ban self-evaluation as a sole basis for policy. Decisions that affect everyone must be tested against external, independent audits.
Why this is not doom for sport
Hayek is not arguing for chaos, and Forbidden Planet is not a sermon against curiosity. The shared lesson is simpler. Systems that erase mediating structures in the name of control tend to collapse in ways their designers cannot predict. If you give a few minds godlike actuators, their blind spots become planetary. If you keep coordination plural and contestable, the same blind spots get corrected while still small.
That is the fork in the road. Either we build AI that deepens discovery and competition, or we hook our species up to a Krell machine and hope our Id is nicer than Morbius’s. If you want the optimistic arc, keep the system many, noisy, and defeatable.

If Hayek was right and planning is impossible, then how does Hayek's Market in Newton, NJ, know how much fresh produce and meat to order every week?
How did my dad's restaurant know how many pounds of fresh hamburger and buns to order, not to mention milkshake base, which spoils quickly? If Hayek had ever worked a real job, he would have realized that planning can work. But, like most things in life, there is good planning and there is bad planning. Using Stalin as an example of why planning doesn't work is like saying it's impossible for anyone to tie a tie the right length and using Trump as an example. This is known as the fallacy of hasty generalization. Notwithstanding Stalin's complete stupidity, the Nazi war machine was crushed by the Russians in part through the superior coordination and output of a far more primitive Russian industry (a necessary but not sufficient condition of the Russian victory, which by most accounts destroyed 80% of German military capacity). But, of course, planning never works. This is also why JP Morgan never, ever turns a profit, because it has no way to match its assets and liabilities, and why Amazon has 17 times more delivery drivers than it needs, because it has no way of knowing what fickle consumers will order from day to day. You should short Amazon stock, because it is obviously impossible for such a large company to plan well enough to turn a profit.
Now, that was the serious point, but I must also point out the limits of planning. King Leo's, my dad's drive-in, was located right across the street from Nativity, a large Catholic church and parochial school in Fargo. He did a bang-up business in fishburgers on Fridays and knew how many pounds of fishburgers and how much tartar sauce to order based on highly predictable sales patterns. At least until Pope Paul VI issued his famous apostolic constitution, Paenitemini, in 1966. Not reading Latin, Dad was caught unawares. In response, the U.S. Conference of Catholic Bishops issued a pastoral statement in November 1966 that officially ended the mandatory obligation for U.S. Catholics to abstain from meat on Fridays outside of Lent, provided they performed some other form of penance. Fishburger sales plummeted and hamburger sales took off on Fridays, leaving King Leo's with way too much tartar sauce for years afterward. It took Dad a year to adjust all his ratios and restore rational ordering, proving once again that central planning is not possible!
Note: I had intended to post a photo of Hayek's Market, but photos are not allowed in Substack comments, so you will have to go to hayeksmarket.com and see for yourself.
Really interesting and insightful. Thank you, Kevin, I like this sort of article which draws quirky but cogent parallels.
I posted on Hayek myself recently.
https://thebluearmchair.substack.com/p/hayek-government-freedom-and-the?r=5kmhkr