The Sunday edition #508: Stargate & DeepSeek R-1; AI enzymes; kitchen fusion, photosynthesis in orbit & E(I)V ecosystems ++
An insider's guide to AI and exponential technologies
Hi, it's Azeem.
Two seismic shifts in AI this week show how fast the landscape is evolving. The US government announced "Stargate", a $500 billion partnership with OpenAI to build massive AI infrastructure. Meanwhile, a small Chinese team has matched OpenAI's reasoning capabilities at 1/13th the cost with its open-source DeepSeek R-1 model.
And, if that doesn't get your head spinning, wait until you witness a 36-hour sprint to build a DIY fusion reactor in the kitchen or read about the successful creation of photosynthesis in orbit.
We live in amazing times. Let's make the most of them.
Stargate: A $500 billion bet
President Trump appeared with Sam Altman in the Oval Office to trumpet a massive new initiative called Stargate, a $500 billion plan to build AI infrastructure exclusively for OpenAI. That's an incredible advantage for the already-leading AI company, if it comes to fruition. The first $100 billion alone will fund a Texas data centre rumoured to house 700,000+ GPUs across multiple generations, powered by 1.8GW of energy. You can already see it being constructed below.
Source: SemiAnalysis Datacenter Model - Oracle/Crusoe Abilene, Texas
Yet there are significant question marks over financing for the deal. SoftBank, OpenAI, Oracle and MGX have all signed on, but their path to funding the project is not entirely clear. Even Musk pointed this out and trashed the project, much to the chagrin of Trump's staff. Microsoft's CEO, Satya Nadella, also weighed in on the financing with a comment that is simply baller: "I'm good for my $80 billion."
Is $500bn a lot? During the Railway Mania of the 19th century, 6-7% of Britain's GDP was poured into railway infrastructure each year, reaching a cumulative 40% of annual GDP. By that measure, Stargate is small, but today's economies are larger and more diversified than yesteryear's. Rebuilding the American electrical grid, for comparison, would be a multitrillion-dollar outlay.
But in 2022, Amazon, Microsoft, Google and Meta dropped roughly $150bn on capex. This year, the four could clock in at $220-240bn, and they can afford to keep that investment rate up for years. We shouldn't understate Stargate, but it isn't the only game in town.
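A quick back-of-the-envelope check of the comparison above, sketched in Python. The dollar figures come from the text; the roughly four-year build-out horizon for Stargate is an assumption for illustration:

```python
# Figures from the text; the build-out horizon is an assumption.
stargate_total_bn = 500              # headline Stargate commitment, $bn
stargate_years = 4                   # assumed multi-year build-out
big_four_capex_bn = (220 + 240) / 2  # midpoint of this year's estimate, $bn/yr

stargate_annual_bn = stargate_total_bn / stargate_years
print(f"Stargate: ~${stargate_annual_bn:.0f}bn per year")
print(f"Amazon + Microsoft + Google + Meta: ~${big_four_capex_bn:.0f}bn per year")
```

Spread over four years, Stargate's annual spend would come to roughly half of what the four hyperscalers already invest each year, which is the "not the only game in town" point in numbers.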
Seeking deep advantage
DeepSeek R-1, a new reasoning model from China, has matched the performance of OpenAI's o1 while being 13 times cheaper to run, and it's fully open-source. As one prominent commentator put it:

> DeepSeek R1 is one of the most amazing and impressive breakthroughs I've ever seen, and as open source, a profound gift to the world.
It began as a side hustle for DeepSeek founder Liang Wenfeng's quant-trading firm High-Flyer, tapping leftover GPU capacity. Their earlier V3 model was built in just two months on a $5.576 million budget. Meta no longer holds the top open-source spot and has gone into panic mode. OpenAI might also be uneasy… there's currently little incentive to stick with o1.
While OpenAI's Sebastien Bubeck points out that these reasoning models are "extremely scalable" (more compute means more advantage), open-source efforts have a unique power to proliferate innovation. When the entire world can access and tinker with these models, seeing every secret in the research paper, surprising breakthroughs will likely emerge at lightning speed.
There is a lot to parse here. DeepSeek's approach (model distillation, sparsity and other techniques) can be copied by other firms. And it's probably fair to say DeepSeek is a fast follower rather than a leader. It also shows that the proliferation of performant models will continue… with all the implications for business models and safety.
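To make "model distillation" concrete, here is a minimal, self-contained sketch of the core idea; this is an illustration of the general technique, not DeepSeek's actual training code. A small student model is trained to match the temperature-softened output distribution of a larger teacher, typically by minimizing a KL-divergence loss:

```python
import math

def softmax(logits, temperature=1.0):
    """Turn raw logits into a probability distribution; higher temperature softens it."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the teacher's softened distribution and the student's.
    Training the student to minimize this transfers the teacher's behaviour."""
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher = [4.0, 1.0, 0.2]     # hypothetical logits from a large reasoning model
close_student = [3.8, 1.1, 0.3]  # student that mimics the teacher well -> low loss
far_student = [0.2, 1.0, 4.0]    # student that disagrees -> high loss
print(distillation_loss(teacher, close_student), distillation_loss(teacher, far_student))
```

In practice this KL term is usually combined with an ordinary cross-entropy loss on ground-truth labels, and the temperature matters: softening the teacher's distribution lets its relative preferences among the "wrong" answers carry signal too.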
See also: Another Chinese company, ByteDance, which owns TikTok, released Doubao-1.5-pro, a model that matches GPT-4o's performance while being eight times cheaper.
Faster enzyme engineering