🟣 EV Daily: Anthropic demands a power surge ⚡
Plus: Google eats the web | Amazon buys Bee 🐝 | UN goes green 🌱 | Ng on framing AI tasks
Lead story: ⚡ Anthropic demands a power surge
Anthropic estimates that the US will need to build 50 gigawatts of new power capacity by 2028 to meet AI demand. That’s nearly twice New York State’s peak summer load, or the output of 33 nuclear reactors (rough arithmetic below). The warning lands just as OpenAI and Oracle have announced new Stargate AI data centers that will themselves consume 4.5 gigawatts. Leopold Aschenbrenner, author of the 2024 Situational Awareness paper, projected that the largest training cluster in 2028 would draw about 10 gigawatts. The largest single cluster probably won’t reach quite that level, since most of the new capacity will go to inference and training will be split across multiple providers and clusters. Even so, these developments point to the emergence of some enormous clusters. [Anthropic, Tom’s Hardware, Situational Awareness]
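A quick back-of-envelope on the reactor comparison, assuming roughly 1.5 GW of output per large reactor (an illustrative figure of ours, not one from Anthropic’s report):

```latex
% Rough reactor-equivalent check for 50 GW of new capacity,
% assuming ~1.5 GW per large nuclear reactor (illustrative assumption).
\[
  \frac{50~\text{GW}}{1.5~\text{GW per reactor}} \approx 33~\text{reactors}
\]
```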
Key signals, quick scan
A 30-second scan of four secondary signals that hint at where the curve is bending.
🗣️ ChatGPT now has more than 500 million weekly active users, who send more than 2.5 billion prompts every day, 330 million of them from the US. Sam Altman is revealing the rarely shared usage figures as he tours Washington DC to make the case for hands-off regulation. [Axios]
⛔ Google’s AI Overviews cut external clicks by 7 percentage points and raise session endings by 10 points compared with standard search results, a Pew study finds. That is more quantitative evidence that generative summaries can cannibalize the open web, eroding the traffic economics on which advertising, e-commerce, and media depend. [Pew Research Center]
🐝 Amazon has acquired Bee, maker of a $49 AI-assistant wrist device, for an undisclosed sum. The purchase signals a renewed push for voice-and-vision AI wearables priced for mass adoption, a possible alternative to the smartphone. [TechCrunch]
♻️ UN secretary-general António Guterres wants AI data centers to run entirely on renewable energy by 2030. The diplomatic pressure turns the carbon intensity of AI compute from a tech‑industry debate into a multilateral climate priority. [Bloomberg]
Future of work: Valuing judgement
As generative AI accelerates coding and research, the bottleneck shifts to determining what is worth building. Andrew Ng calls this the ‘product‑management bottleneck.’ He says winning teams have product managers who pair deep user empathy with a fast, data‑refined gut instinct. They make most decisions quickly and slow down only when fresh evidence contradicts their mental model.
Another bottleneck is quality control: making sure AI-generated output actually meets the standard you set. We recently saw the downside when our own AI fact-checking process failed. The broader lesson is clear: now that AI can handle the generation, progress in any AI‑driven workflow hinges on two equally critical abilities – framing high‑value tasks and rigorously validating the outputs. [Andrew Ng]
Was this email useful?
Thanks! Want to share this with someone like you? Forward the email.