🔮 Sunday edition #538: Watts, not weights. The capex clock. Liberalism & human nature. Robots, walking & defense startups++
Hi all,
Welcome to our Sunday edition, where we explore the latest developments, ideas, and questions shaping the exponential economy.
Enjoy the weekend reading!
Azeem
The burn rate of a bubble
The generative AI investment boom is not immune to exuberance. It’s notoriously hard to get infrastructure spending right, especially during the installation phase of a new technology, when over-capacity tangos with under-supply. During this boom, though, there seems to be no end in sight to ever-larger commitments to ever-larger datacenters.
But might capex growth outpace realised revenues? Back in February, I highlighted that quandary:
Morgan Stanley roughly estimates $520 billion by 2028. Assuming a 50% margin, it implies $1 trillion in new revenues that need to be found in that year, well above my threshold for disappointment.
What happens if revenues don’t show up to support those investments?
Turning to the question of depreciation this week, Praetorian Capital reckons that covering depreciation on this year’s GPU cohort[1] requires about $40 billion in profits. Depending on profit margins, that could translate into three to four times as much in revenue. Last year, hyperscalers booked just $45 billion in generative AI revenue.
Forecasts put AI revenue above $1 trillion by 2028. About $400 billion would come from enterprise productivity spending, while $680 billion would come from consumer wallets (across e-commerce spending, advertising, new apps and content).
But that is years away, beyond the lifespan of the chips being installed today. It is hard to reconcile the math: these GPUs are consumables. They aren’t highways or cable ducts that can be sweated for decades. They live on balance sheets, on what I believe are improbable six-year depreciation schedules. Feels a bit bubbly.
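A minimal back-of-envelope sketch of that arithmetic, assuming 25–33% profit margins to reproduce the “triple or quadruple” range (Praetorian’s exact assumptions aren’t spelled out here):

```python
# Illustrative arithmetic only. The $40bn and $45bn figures are those cited
# above; the margin assumptions are mine, not Praetorian Capital's model.

depreciation_to_cover = 40e9    # profits needed to cover this year's GPU cohort
last_year_genai_revenue = 45e9  # hyperscalers' generative-AI revenue last year
assumed_margins = [0.25, 0.33]  # hypothetical profit margins

for margin in assumed_margins:
    revenue_needed = depreciation_to_cover / margin
    multiple = revenue_needed / last_year_genai_revenue
    print(f"At a {margin:.0%} margin: ~${revenue_needed / 1e9:.0f}bn of revenue, "
          f"{multiple:.1f}x last year's genAI revenue")
```

Either way, the revenue needed just to cover this year’s depreciation is a multiple of everything the hyperscalers earned from generative AI last year.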
What changes the picture is that sober-minded infrastructure capital is starting to play in this build-out. Meta, for example, tapped PIMCO and Blue Owl for a $29 billion financing. These aren’t frivolous decision makers. So the bet must be that the borrower, Big Tech, is good for the money.
I suspect the Big Tech firms are motivated by the following logic: there is a real possibility that demand will run hotter than even the most eager forecasts suggest, and that these massive data centers will increasingly convert to revenue as more consumers turn to paid apps, as genAI lifts ad revenues through new inventory and better ROI, and as business demand continues to grow.
Finally, the industry can see a huge and growing market out there. Under-provisioning their capacity means failing to serve customers at the quality they want. Their competitors are eagerly waiting.
That’s likely the rationale. We could all, of course, be wrong.
See also: All of this is making investors very jumpy. One example was a report from a minor MIT research group, which found that 95% of enterprise pilots fail to impact P&L. This spooked markets, so much so that even the FT covered it. But the research builds a punchy narrative on wafer-thin evidence: convenience samples rather than representative ones, a short observation window and subjective meta-analyses, followed by a final leap to sweeping prescriptions. Calmer heads, everyone!
The watts doctrine
A growing strand of Chinese AI strategy reframes the ambition as a physics problem: how efficiently can you convert energy into useful intelligence?
Its proponents call it the energy-compute theory – the idea that watts, not weights, will determine who dominates AI and the economy built around it. Professor Yu at Tsinghua University talks about moving from “race to AGI” to “ubiquitous edge intelligence”:

To achieve this, Prof. Yu calls for merging intelligence capability and accessibility into a single objective function: maximizing inference energy efficiency subject to a given IQ threshold. That IQ constraint guarantees we are focused on what Prof. Yu calls “high quality […] tokens.”
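Written as an optimisation problem, that objective might look something like this (the notation is mine, not Prof. Yu’s):

$$\max_{\theta}\;\frac{\text{useful tokens}(\theta)}{\text{energy consumed}(\theta)}\quad\text{subject to}\quad \mathrm{IQ}(\theta)\ \ge\ \mathrm{IQ}_{\min}$$

where $\theta$ stands for the whole deployed system (model, hardware, serving stack) and $\mathrm{IQ}_{\min}$ is the capability floor below which tokens stop counting as “high quality”.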
Improvements in energy efficiency multiply: better chips, smarter models and new architectures compound into outsized advantages. China is using robotics and heterogeneous chip stacks as forcing functions to unlock these compound gains.
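As a rough illustration (the numbers are mine, purely to show the compounding):

$$g_{\text{total}} = g_{\text{chip}}\times g_{\text{model}}\times g_{\text{arch}},\qquad\text{e.g.}\quad 2\times 3\times 2 = 12\times\ \text{more useful output per watt.}$$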
To contextualise, let’s recognise that American genAI firms also care about efficiency. Computation uses energy; energy costs money. AI companies have every incentive to reduce those costs: viz. OpenAI, whose GPT-5 has a router that steers easy tasks to cheaper models.
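Here is a toy sketch of that routing idea. It is not OpenAI’s implementation; the difficulty heuristic and model names are invented for illustration:

```python
# Toy cost-aware router: send easy prompts to a cheap model, hard ones to an
# expensive one. The heuristic and model names here are hypothetical.

def estimate_difficulty(prompt: str) -> float:
    """Crude proxy: longer prompts and 'hard task' keywords score higher."""
    keywords = ["prove", "derive", "step by step", "refactor", "debug"]
    score = min(len(prompt) / 2000, 1.0)
    score += 0.2 * sum(k in prompt.lower() for k in keywords)
    return min(score, 1.0)

def route(prompt: str, threshold: float = 0.5) -> str:
    """Pick a (hypothetical) model tier based on estimated difficulty."""
    if estimate_difficulty(prompt) >= threshold:
        return "big-expensive-model"
    return "small-cheap-model"

print(route("What's the capital of France?"))                          # small-cheap-model
print(route("Derive the gradient step by step, then debug my code"))   # big-expensive-model
```

The economics are simple: if most traffic is easy, the average energy (and dollar) cost per answer falls sharply, even though the hardest queries still get the big model.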
Google hasn’t stood still either. In the past year, it reduced the energy consumption of its LLMs by a factor of 33. The average Gemini prompt now consumes 0.24 Wh. To put this in context, if you watch an hour of Netflix on a big TV, you’ll use as much energy as 400 typical Gemini queries.
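A quick sanity check on that comparison (the roughly 100 W draw for a large TV is my assumption, not a figure from Google):

$$400 \times 0.24\,\mathrm{Wh} = 96\,\mathrm{Wh} \approx 1\,\mathrm{h}\times 100\,\mathrm{W}$$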
And the startup world is full of firms going all-in on driving down the energy cost of intelligence, including some of the companies I’ve invested in: Vaire Computing, Fractile and Trainloop. All of them will deliver increasingly heavy, high-quality AI workloads with greater efficiency.
The intelligence/energy exchange rate is far from fixed.
Liberalism and human nature
Modern strands of liberal thought often treat autonomy as the highest good: the promise of maximum choice, unbounded freedom and the ability to shape ourselves however we wish.
A new essay argues that we’ve misunderstood liberalism’s core strength. It was never about boundless freedom but about its recognition of limits – that our dignity and equality arise from a shared, enduring human nature:

Liberals need to recognize that while autonomy is a human good, it is not the sole human good that trumps all other ends that people choose. Part of the human experience lies in coming to terms with limitations, both those that apply to us as individuals, and those that are true of human beings as a species.
This perspective becomes urgent across today’s debates, from biotechnology to AI. Consider genetic enhancement: altering the traits of future generations involves irreversible choices imposed on those who cannot consent. Parents may exercise exactly the kind of arbitrary power over others that liberalism was meant to restrain. As Jürgen Habermas warned in The Future of Human Nature, we risk creating new forms of domination disguised as liberation.
Elsewhere
In AI, tech & science:
How well can AI predict the future? There is now a benchmark for that.
While GPT-5 can do simple spatial tasks, it still fails on longer tasks that require more complex spatial reasoning.
Generative AI changes its ethical stance based on how you talk to it.
Yann LeCun suggests “submission to humans” and “empathy” as two fundamental AI guardrails.
China’s new 17-megawatt offshore wind turbine prototype is engineered to ride out waves more than 24 meters high. Floating platforms unlock deeper waters, where roughly 80% of global offshore wind potential sits beyond fixed-bottom depth limits.
Bacteria and viruses can be combined to overcome each other’s weaknesses as a cancer therapy.
Mind. Blown. A newly discovered microbe with one of Earth’s tiniest genomes challenges our fundamental definition of what constitutes a living organism.
The World Humanoid Robot Games in Beijing were impressive:
And very fun to watch:
Markets and strategy:
Defence-tech’s new pitchbook: naval-drone unicorn Saronic is a test case for how frontier technologies are colliding with geopolitics. Its path to revenue is not hockey-stick user growth but Pentagon procurement and government relations.
What does China actually want? An analysis of more than 12,000 official documents and speeches suggests it prioritizes domestic stability, economic development and territorial integrity.
In society & culture:
A city’s walkability increases people’s physical activity levels. Separately, a structured exercise intervention can lead to clinically meaningful improvements in survival for colon cancer survivors.
A critical look at the rationalist community.
Evidence suggests that the English Industrial Revolution was the primary catalyst for the breakdown of the traditional birthright-based social hierarchy.
Thanks for reading!
Azeem
[1] Google, Amazon, Microsoft and Meta will spend $400 billion on datacenters in 2026 alone.