A month ago, we released our framework for assessing whether AI is a bubble. The framework uses five key gauges, each measuring a different industry stressor and flagging whether it sits in a safe, cautious or danger zone. These zones have been back-tested against several previous boom-and-bust cycles. As a reminder, we track:
Economic strain (capex as a share of GDP)
Industry strain (investment relative to revenue)
Revenue momentum (doubling time in years)
Valuation heat (Nasdaq-100 P/E ratio)
Funding quality (strength of funding sources)
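For readers who like to see the mechanics, here is a minimal sketch of how the dashboard's gauges could be represented in code. The gauge names and metrics come straight from the list above; everything else (the zone labels, the Python structure itself) is illustrative rather than a description of our production system.

```python
from dataclasses import dataclass
from enum import Enum

class Zone(Enum):
    SAFE = "green"      # well within historical norms
    CAUTION = "amber"   # elevated; worth watching
    DANGER = "red"      # levels associated with past busts

@dataclass
class Gauge:
    name: str    # the stressor the gauge tracks
    metric: str  # the measure it is read from

# The five gauges of the framework, as listed above.
GAUGES = [
    Gauge("Economic strain", "AI capex as a share of US GDP"),
    Gauge("Industry strain", "investment relative to revenue"),
    Gauge("Revenue momentum", "revenue doubling time in years"),
    Gauge("Valuation heat", "Nasdaq-100 P/E ratio"),
    Gauge("Funding quality", "strength of funding sources"),
]
```

Each gauge is calibrated separately against historical cycles; the economic strain thresholds, for example, appear in the next section.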
The framework has circulated through boardrooms, investment memos, and policy circles. Today, we're taking it a step further.
We are launching v1 of a live dashboard, updated in real time as new data comes in. Paying members get first access for the initial 24 hours and can access it below; after that, we'll release it to the general public.
In today's post, we'll introduce paying members to the dashboard and share an update on the changes we have observed since our September essay.
A month later, what's different?
Economic strain
The economic strain gauge measures how much of the US economy is being consumed by AI infrastructure spend. We look at AI-related capital expenditure in the US as a share of US GDP. This gauge is green if capex/GDP is below 1%; amber at 1-2%; and red once it crosses 2%. Historically, the three American railroad busts of the 19th century all exceeded 3%. The ratio was roughly 1% during the late 1990s telecoms expansion and a little higher during the dotcom bubble.
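The threshold logic for this gauge is simple enough to write down. The cut-offs below (1% and 2% of GDP) are the ones just described; the function is a sketch of how a reading gets coloured, not our production code, and the readings in the example are illustrative.

```python
def economic_strain_zone(capex_share_of_gdp: float) -> str:
    """Colour the economic strain gauge from AI capex as a % of US GDP."""
    if capex_share_of_gdp < 1.0:
        return "green"   # safe
    if capex_share_of_gdp <= 2.0:
        return "amber"   # cautious
    return "red"         # danger

# Illustrative readings only, in % of GDP:
for reading in (0.5, 1.5, 3.0):
    print(reading, "->", economic_strain_zone(reading))
# 0.5 -> green, 1.5 -> amber, 3.0 -> red (the 19th-century railroad busts all exceeded 3%)
```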
Since we last updated the dashboard, economic strain has increased but remains in safe territory. Google, Microsoft and Meta increased their collective capex by 11% compared to the previous quarter. Hyperscalers' spending on AI infrastructure shows no sign of slowing.
Industry strain
Revenue is one of the key metrics we track to judge whether AI is a boom or a bubble. It feeds into two of our five gauges: revenue momentum (revenue doubling time) and industry strain, which measures whether revenue is keeping pace with investment. Investment usually comes before revenue. It's a sign of optimism. But that optimism must be grounded in results: real customers spending real money.
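To make those two gauges concrete, here is a rough sketch of the arithmetic behind them. It assumes revenue grows at a steady annual rate, which is a simplification of how we actually estimate doubling time, and the numbers in the example are illustrative rather than our estimates.

```python
import math

def doubling_time_years(annual_growth_rate: float) -> float:
    """Years for revenue to double at a constant annual growth rate (0.3 = 30%)."""
    return math.log(2) / math.log(1 + annual_growth_rate)

def industry_strain(revenue: float, investment: float) -> float:
    """Revenue as a fraction of investment; lower means revenue is lagging capex."""
    return revenue / investment

print(round(doubling_time_years(1.0), 2))   # 1.0  -> revenue doubling every year
print(round(doubling_time_years(0.3), 2))   # 2.64 -> roughly two and a half years at 30% growth
print(round(industry_strain(1.0, 6.0), 2))  # 0.17 -> revenue covering about one-sixth of investment
```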
This was one of the most challenging pieces of analysis to assemble, as reliable revenue data in the generative-AI sector remains scarce and fragmented. Most companies disclose little detail, and what does exist is often inflated, duplicated or buried within broader cloud and software lines. Our model tackles this by tracing only de-duplicated revenue: money actually changing hands for generative-AI products and services. That means triangulating filings, disclosures and secondary datasets to isolate the signal. The result is a conservative but more realistic picture of the sector's underlying economics. The simplified Sankey diagram below shows how we think about those flows.
Consumers and businesses pay for generative-AI services, including chatbots, productivity tools such as Fyxer or Granola, and direct API access.
Third-party apps may rely on models from Anthropic, Google and others, or host their own.
Big tech firms, particularly Google and Meta, deploy generative AI internally to improve ad performance and productivity, blending proprietary and third-party systems.
In this simplified public version, we group hyperscalers and neoclouds together and collapse smaller cost categories into "Other." Flow sizes here are illustrative, but our full model tracks them precisely. (Get in touch if you want institutional access to our revenue data and modeling.)
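To give a feel for how de-duplication works in practice, here is a toy version of the flow structure behind the Sankey diagram. The node names mirror the groupings above; the amounts are placeholders purely to make the example run and bear no relation to our actual estimates.

```python
# Toy revenue flows: (payer, payee, placeholder amount). Not real figures.
flows = [
    ("Consumers & businesses", "Chatbots & productivity tools", 3.0),
    ("Consumers & businesses", "Third-party apps",              2.0),
    ("Consumers & businesses", "Direct API access",             1.0),
    ("Third-party apps",       "Model providers",               1.0),
    ("Model providers",        "Hyperscalers & neoclouds",      2.0),
    ("Model providers",        "Other",                         0.5),
]

# De-duplicated revenue counts only money entering the sector from end customers.
# Downstream transfers (apps paying model providers, providers paying for compute)
# are costs within the sector, so they are not added a second time.
end_customer_revenue = sum(
    amount for payer, _, amount in flows if payer == "Consumers & businesses"
)
print(end_customer_revenue)  # 6.0 with these placeholder numbers
```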
Back in September, we estimated that revenue covered about one-sixth of the proposed industry capex. Our historical modeling put this in deep amber territory. We have now updated our models with an improved methodology and more recent data, and the results have changed the look of our dashboard.




