Week 2

AI Makes Sovereignty Industrial Again

The Production Function of AI Power

"The production function is back. AI models are not 'just code'—they are the output of an industrial pipeline requiring compute, energy, data, talent, and algorithmic efficiency."

01
The Paradigm Shift

From "Software" to "Sovereignty"

For decades, the tech industry celebrated the dematerialization of value creation. Software, we were told, "eats the world"—transforming heavy industry into lightweight code that could be copied infinitely at near-zero marginal cost.

AI disrupts this narrative. The production function—the economic relationship between inputs and outputs—has returned with a vengeance. Building frontier AI systems requires massive physical infrastructure: specialized semiconductors, gigawatts of electricity, curated datasets, and elite human talent.

Paul Romer's endogenous growth theory takes on new relevance: non-rival ideas (algorithms, architectures) still require rival inputs (chips, power, data centers) to generate value. The AI revolution is industrial in nature.

"Ideas are non-rival, but their production and deployment require rival inputs."

— after Paul Romer, endogenous growth theory

The Shift in Production

Traditional Software: code, a developer, a laptop. Low capital intensity.
AI Systems: compute, energy, data, talent. $100M-500M per frontier training run.
02
The Formula

The AI Production Function

A mathematical framework for understanding what it takes to build frontier AI capabilities

A = f(C, E, D) · T · η

Where AI capability A is a function of five critical inputs: compute C, energy E, data D, talent T, and algorithmic efficiency η
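The module leaves the functional form of f unspecified. As an illustration only, assuming a Cobb-Douglas form (a standard choice in growth economics, not something the module prescribes), the formula can be made concrete:

```latex
% Illustrative instantiation only: the module leaves f unspecified.
% Exponents alpha, beta, gamma, delta in (0,1) encode diminishing returns.
A \;=\; \underbrace{C^{\alpha}\, E^{\beta}\, D^{\gamma}}_{f(C,\,E,\,D)} \;\cdot\; T^{\delta} \;\cdot\; \eta
```

Read this way, doubling compute alone raises capability by a factor of 2^α < 2 (diminishing returns), while doubling η doubles capability outright. That asymmetry is why the module treats efficiency as the disruptor.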

C · Compute (Capital Moat)

Chips, clusters, data centers. The physical silicon that processes AI workloads.

E · Energy (Strategic Friction)

Electricity, grid capacity, cooling. AI runs on power, and lots of it.

D · Data (Controlled Assets)

Access, rights, freshness. High-quality training data is becoming enclosed.

T · Talent (Global Competition)

Researchers, engineers, architects. The human bottleneck in AI development.

η (eta) · Efficiency (The Disruptor)

Algorithmic innovations, architectural improvements, training optimizations. The counter-force that can multiply output without multiplying inputs: Mixture-of-Experts, distillation, and other techniques that challenge the capital-intensive paradigm.
03
Variable C

Compute

The Capital Moat

Compute has become the primary gatekeeper of AI capability. Training a frontier model requires tens of thousands of specialized AI accelerators running in parallel for months—a capital expenditure that runs into the hundreds of millions of dollars.

The compute stack is dominated by two critical chokepoints: NVIDIA controls over 80% of the AI accelerator market, while TSMC manufactures approximately 90% of the world's most advanced logic chips. This pair of near-monopolies, one in chip design and one in fabrication, shapes the geopolitical landscape of AI development.

>80%: NVIDIA's share of the AI accelerator market
~90%: TSMC's share of the world's most advanced logic chips
$100-500M: cost of a single frontier training run
10K+: GPUs in a single training cluster
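A back-of-envelope sketch of where the $100-500M figure comes from (the GPU count, duration, and hourly rate below are illustrative assumptions, not disclosed vendor prices):

```python
# Rough training-run cost model: GPUs x hours x hourly rate.
# All inputs are illustrative assumptions.

def training_run_cost(num_gpus: int, days: float, usd_per_gpu_hour: float) -> float:
    """Estimate the compute bill for a single training run, in USD."""
    return num_gpus * days * 24 * usd_per_gpu_hour

# 20,000 accelerators running for 90 days at an assumed $2.50/GPU-hour:
cost = training_run_cost(num_gpus=20_000, days=90, usd_per_gpu_hour=2.50)
print(f"${cost / 1e6:.0f}M")  # -> $108M, at the low end of the range above
```

Longer runs, larger clusters, retries, and ablation experiments push real programs toward the top of the range.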

The Frontier System Stack

AI Accelerators: NVIDIA H100/H200, AMD MI300
HBM Memory: SK Hynix, Samsung, Micron
Networking: InfiniBand, NVLink, Ethernet
Cooling Systems: liquid cooling, immersion

"Denial as strategy—restricting access to frontier chips has become a primary tool of AI geopolitics."

04
Variable E

Energy

"AI Runs on Electricity"

Power has emerged as the binding constraint on AI expansion. A single large training cluster can consume as much electricity as a small city, and data center power demand is projected to grow 160% by 2030, creating intense competition for grid capacity.

This scarcity has triggered grid moratoriums in key markets: Northern Virginia, the world's largest data center hub, has paused new connections, and Ireland has imposed similar restrictions. The result is "stranded intelligence": GPUs deployed but unable to run at capacity because the power to feed them is unavailable.

Big Tech's response has been a dramatic pivot to nuclear energy. Microsoft signed a 20-year power purchase agreement to restart Three Mile Island at a reported $110-115/MWh, a 3-4x premium over conventional power and a measure of how critical energy has become. Google partnered with Kairos Power to deploy 500 MW of small modular reactors by 2035, and Amazon pursued a nuclear PPA at the Susquehanna plant, only to see the arrangement blocked by FERC.

Big Tech Nuclear Pivot

Microsoft: Three Mile Island, 20-year PPA at $110-115/MWh
Google: Kairos Power, SMR deployment of 500 MW by 2035
Amazon: Susquehanna, nuclear PPA blocked by FERC

Stranded Intelligence

GPUs sitting idle due to power constraints: capital deployed but capability unrealized.

Grid Moratoriums

Northern Virginia · Ireland · Singapore · Amsterdam
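To see why a single cluster strains a grid, here is a rough power budget (a sketch; the per-GPU draw and overhead factor are assumptions broadly in line with public H100-class specifications):

```python
# Rough cluster power budget. All numbers are illustrative assumptions.
gpus = 20_000
watts_per_gpu = 700    # H100-class board power
pue = 1.3              # assumed overhead for cooling, networking, conversion

megawatts = gpus * watts_per_gpu * pue / 1e6
print(f"{megawatts:.1f} MW continuous draw")  # ~18 MW, a small city's load

# Energy over a 90-day run, priced at the nuclear PPA rate cited above:
mwh = megawatts * 24 * 90
print(f"{mwh:,.0f} MWh, about ${mwh * 112.5 / 1e6:.1f}M at $112.5/MWh")
```

Note the asymmetry: the electricity bill (single-digit millions) is small next to the hardware bill, yet without the megawatts the hardware produces nothing.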
05
Variable D

Data

From Free Scraping to Controlled Assets

The era of free, unrestricted data scraping is ending. High-quality training data—particularly code repositories, textbooks, and academic papers—has become a strategic resource subject to enclosure and monetization.

Platforms are erecting barriers: paywalls, API pricing hikes, and aggressive anti-scraping measures. Reddit signed licensing deals with Google and OpenAI reportedly worth $60-80 million annually. X (formerly Twitter) changed its privacy policy to explicitly allow AI training on user content.

Legal battles are reshaping the landscape. The New York Times' lawsuit against OpenAI challenges the foundational assumption that training on copyrighted content constitutes fair use. The outcome could fundamentally alter how AI companies acquire training data.

"Data is the new oil—but unlike oil, its value increases with refinement and exclusivity."

Data Enclosure Timeline

2023: Reddit API changes. Pricing increases, access restrictions.
2024: Reddit licensing deals with Google and OpenAI. $60-80M in annual agreements.
2024: X privacy policy update. Explicit AI training permission.
Ongoing: NYT v. OpenAI lawsuit. Fair use boundaries being tested.

High-Value Data Types

Code · Textbooks · Papers · Dialogues

06
Variable T

Talent

The Human Bottleneck

Despite the focus on compute and capital, humans remain the critical bottleneck. Elite AI researchers, those capable of architecting and training frontier models, number only in the low thousands globally.

The development pipeline is long: 10-15 years from undergraduate study to leading researcher. This creates a supply inelasticity that no amount of capital can immediately resolve.

MacroPolo's Global AI Talent Tracker reveals a striking pattern: China produces 47% of top AI researchers (measured by undergraduate education), while the United States hosts 57% (measured by current employment). This production-hosting asymmetry defines the talent competition.

The sovereignty dilemma is acute: countries must decide whether to prioritize attracting foreign talent (immigration) or retaining domestic talent (preventing emigration). The "visa wars" are intensifying: the US has raised H-1B fees past $100K, China is recruiting aggressively through programs such as the Global Excellent Scientists Fund, and Canada offers fast-track visas through its Global Impact+ Program.

Global AI Talent Competition

China produces 47% of top AI researchers
US hosts 57% of top AI researchers
Source: MacroPolo Global AI Talent Tracker

The Visa Wars

US: H-1B fee increase, $100K+
CN: Global Excellent Scientists Fund, aggressive recruitment
CA: Global Impact+ Program, fast-track visas

The 10-15 Year Cycle

No amount of capital can compress the time required to develop elite AI researchers.
07
Variable η

Efficiency (η)

The Counter-Force

Algorithmic innovations that multiply output without multiplying inputs—the great disruptor

THE DEEPSEEK SHOCK (2025)

Algorithmic Asymmetry

DeepSeek demonstrated that a reported $5.6 million training run (the disclosed compute cost of the final run, excluding prior research and infrastructure) could achieve capabilities comparable to models trained for $100-500 million, a cost gap of roughly 20-90x. The efficiency gains came from architectural innovation, not capital expenditure.

$5.6M: DeepSeek's reported final training run
$100-500M: typical frontier training runs
Mixture-of-Experts (MoE)

Activate only relevant parameters per token, reducing compute by 3-5x
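A minimal sketch of the MoE idea (a toy, not DeepSeek's implementation; production layers add load balancing, capacity limits, and batched dispatch): a router scores the experts for each token, and only the top-k actually run, so per-token compute scales with k rather than with the total expert count.

```python
import numpy as np

# Toy Mixture-of-Experts layer: route each token to its top-k experts.
rng = np.random.default_rng(0)
d, n_experts, k = 16, 8, 2          # hidden size, expert count, active experts

router = rng.normal(size=(d, n_experts))      # router (gating) weights
experts = rng.normal(size=(n_experts, d, d))  # one weight matrix per expert

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Forward one token x of shape (d,): only k expert matrices are applied."""
    scores = x @ router                        # (n_experts,) routing scores
    top = np.argsort(scores)[-k:]              # indices of the k highest scores
    gates = np.exp(scores[top] - scores[top].max())
    gates = gates / gates.sum()                # softmax over selected experts
    return sum(g * (experts[i] @ x) for g, i in zip(gates, top))

y = moe_forward(rng.normal(size=d))
# With k=2 of 8 experts active, the layer does ~1/4 of the dense matrix work;
# the 3-5x savings cited above comes from exactly this sparsity.
```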

Distillation

Transfer knowledge from a large teacher model to a smaller student, retaining much of the capability at a fraction of the cost
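A framework-free sketch of the standard distillation loss (a generic formulation with toy values, not any specific lab's recipe): the student is trained to match the teacher's temperature-softened output distribution.

```python
import numpy as np

# Generic knowledge-distillation loss: match the teacher's soft targets.
def softmax(logits: np.ndarray, temperature: float = 1.0) -> np.ndarray:
    z = logits / temperature
    e = np.exp(z - z.max())
    return e / e.sum()

def distill_loss(teacher_logits, student_logits, temperature=2.0) -> float:
    """KL divergence from the teacher's soft targets to the student's output."""
    p = softmax(teacher_logits, temperature)   # teacher: soft targets
    q = softmax(student_logits, temperature)   # student: current predictions
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return float(np.sum(p * np.log(p / q))) * temperature ** 2

teacher = np.array([4.0, 1.0, 0.5])
student = np.array([2.0, 1.5, 1.0])
print(distill_loss(teacher, student))  # ~0.72; approaches 0 as student matches
```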

Training Optimizations

Better initialization, curriculum learning, data quality filtering
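As a concrete instance of the third item, one widely used data-quality heuristic filters documents by reference-model perplexity (a hedged sketch: the scoring function and threshold here are placeholders, and labs combine many such filters):

```python
from typing import Callable, Iterable

# Perplexity-based data filtering: drop documents that a small reference
# language model finds too surprising (often spam, boilerplate, or noise).
# score_fn and the threshold are placeholders, not published settings.

def filter_corpus(docs: Iterable[str],
                  score_fn: Callable[[str], float],
                  max_perplexity: float = 80.0) -> list[str]:
    """Keep only documents whose reference-model perplexity is below threshold."""
    return [doc for doc in docs if score_fn(doc) <= max_perplexity]
```

Curriculum learning can reuse the same scores: instead of discarding high-perplexity documents, order training from easier to harder ones.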

Does Efficiency Rewrite the Balance?

If algorithmic improvements continue at this pace, the capital-intensive paradigm may shift. But efficiency gains themselves require talent and research—creating a recursive dependency on the very inputs they seek to reduce.

08
Integration

Institutions & Capital

The organizational and financial infrastructure that binds all inputs together

High Barriers

Capital, expertise, and infrastructure requirements exclude most entrants

Cloud Coupling

Hyperscalers control the infrastructure; AI companies depend on them

State Policy

Export controls, subsidies, and industrial policy shape outcomes

Organizational Feat

Building frontier AI is as much management as science

"Building frontier AI is an organizational feat as much as a scientific one."

— The production function reminds us that ideas require infrastructure

Sources & References

MacroPolo Global AI Talent Tracker: analysis of global AI researcher distribution
Paul Romer, endogenous growth theory: non-rival ideas and rival inputs framework
DeepSeek Technical Report (2025): $5.6M training cost disclosure
Microsoft Three Mile Island PPA: $110-115/MWh nuclear power agreement
Reddit IPO filing (2024): $60-80M data licensing agreements
NYT v. OpenAI court documents: fair use litigation in AI training