🔮 Repetitive forecasts; banned AI; new materials; wireless charging, octopus embodied ++ #451
An insider's guide to AI and exponential technologies
Hi,
It’s Azeem here with our regular Sunday edition. 👋 It’s wonderful to see many hundreds of new subscribers who joined in the past week alone. Welcome! To give you some background, the Sunday edition is part analysis, part curation — it is our way of taking a step back, assessing the most relevant events of the week shaping our exponential age, and putting them in context for you. If you are new to my work and this newsletter, Bryan Walsh’s profile of me is a great place to start to understand where I come from in my analysis.
And if you’ve been reading Exponential View for a while and benefitted from our perspective, take a moment to give that gift of insight to your network by sharing this edition with at least one person.
Latest posts
If you’re not a subscriber, here’s what you missed recently:
🐙 AI is in business — I caught up with Andrew Ng
Since ChatGPT was released a year ago, organisations across the world have been grappling with the rise of generative AI. What have they learnt, and what are their most common mistakes? Are we at an inflection point in the AI revolution, and how can regulation support its development?
Sunday chart: The sun is rising
BloombergNEF (BNEF) released its updated solar installation forecast for 2023, projecting 413GW of new capacity, up 64% from 2022 and quite clearly on an exponential. Great news!
Earlier projections for 2023 consistently underestimated this growth. A January 2022 forecast from the same analysts projected only 236GW of installations, falling 42% short of the current expectation.
BloombergNEF is one of the top energy forecasters and we think of them as a go-to resource, especially for historical data. So why were they so wrong?
Jenny Chase, a BNEF solar analyst, said that “[f]orecasting solar build is hard when people actually pay for the results and therefore want them country by country. It’s easy when you just extrapolate a global line, but that is not terribly useful for setting corporate strategy, and makes your clients yell at you”.
They build forecasts from the ground up. It seems they mostly misjudged the Chinese market, which experienced explosive growth, with a reported 240GW of solar installations in 2023, 58% of all global installations. In addition, Chinese makers have reduced solar prices from $0.23 to $0.14 per watt, a 39% fall since the year began. Increased deployment and price decline help create a virtuous cycle, as we said in September:
This rapid growth means solar costs will likely continue decreasing due to ongoing learning effects. Plummeting costs will increase demand even further, creating a virtuous cycle. Yet while the past has been exponential, future forecasts are still resoundingly linear. These forecasts were wrong before, and they’ll likely be wrong again. What was Einstein’s definition of insanity again?
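To see why straight-line thinking keeps missing, here is a minimal back-of-the-envelope sketch in Python. It takes the figures cited above (413GW installed in 2023, up 64% on 2022) and compares two purely illustrative extrapolations: one that keeps compounding at the same rate, and one that simply adds the same absolute increment every year. This is not BNEF’s methodology, and the assumption that 64% growth persists is exactly that, an assumption.

```python
# Back-of-the-envelope sketch, not BNEF's methodology.
# Figures from the text: 413 GW installed in 2023, up 64% on 2022.
# Illustrative assumption: growth either keeps compounding at 64% per year,
# or the "linear" path just adds the same absolute increment each year.

installs_2023 = 413.0
growth_rate = 0.64                                   # assumed to persist
installs_2022 = installs_2023 / (1 + growth_rate)    # ~252 GW implied by "up 64%"
annual_increment = installs_2023 - installs_2022     # ~161 GW/year for the linear path

exp_path = lin_path = installs_2023
for year in range(2024, 2028):
    exp_path *= 1 + growth_rate
    lin_path += annual_increment
    print(f"{year}: exponential ~{exp_path:,.0f} GW vs linear ~{lin_path:,.0f} GW "
          f"({exp_path / lin_path:.1f}x gap)")
```

Within three or four years the two paths differ by a factor of two or more, which is why forecasts that add a sensible-looking increment each year can end up wildly short of a process that keeps compounding.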
Key reads
Forgiveness, not permission. Salesforce’s study of 14,000 workers in 14 countries reveals that over half use generative AI at work without official permission, and 40% use banned AI tools. The tools are too useful for many desk-bound tasks, so it’s not surprising that workers are turning to them. For the CIO, it reminds me of the old adage, “The Net interprets censorship as damage and routes around it”. Workers using ChatGPT see corporate restrictions as damage and find their way around the injunctions.
EV member Gianni Giacomelli suggests that if employees don’t benefit from AI, they’ll resist it. To foster an open, AI-friendly workplace, companies should provide training and a clear career path that gives employees security in an AI-influenced landscape. For recommendations, check out Gianni’s piece.
See also:
Predictions of when we’ll have AGI have dropped from 30 years when GPT-3 was announced to 8 years today. Elon seems to think 3 years. (Azeem is more cautious.)
Microsoft shows a powerful prompting strategy that yields state-of-the-art performance from GPT-4 on medical question benchmarks.
StabilityAI released SDXL Turbo, which generates images in real-time while you are writing the prompts (check out the video).
Techno-lords. In my book, I talked about a core challenge of exponential society:
that digital infrastructure undergirds much of our personal, political and public lives. If this infrastructure is exclusively owned and operated by private firms, we will increasingly cease to be fully fledged citizens. We will be mere consumers.
Recent years have proven this, as commentators like Ian Bremmer describe a “technopolar” world where big tech holds more power than nations. We are consumers, not citizens. Previously, tech companies’ influence was rather benign, but now, with growing global conflicts, their impact is more evident. Gary Marcus points out a few instances of this, highlighting how big tech controls decisions about AI and technology-sharing with other nations: Elon Musk refusing to let Ukraine use Starlink in Crimea, for example, and OpenAI deciding to partner with the UAE’s G42. We should not let a few techno-lords make these decisions for us as consumers; instead, we as citizens need democratic control over them. I spoke about some of the approaches to accomplishing this with Professor Hélène Landemore.
Industrious gnomes. DeepMind’s AlphaFold changed protein folding. Now their GNoME project looks to have revolutionised materials discovery, predicting an astounding 380,000 new materials that are believed to be stable. If validated, this would expand the set of known stable materials nearly tenfold. The next step is making them. Luckily, AI and robotics are helping out with that too. A-Lab, an automated material-synthesis platform, created 41 new compounds in 17 days. While only a small fraction of GNoME’s materials will likely prove usable, AI has yet again broadened our scientific horizons, potentially leading to breakthroughs in areas like battery technology or high-temperature superconductors. For more, read the detailed Wired article on GNoME.
Is China the new open-source leader? This week, two impressive Chinese models were released: DeepSeek 67B and Qwen 72B. Both performed exceptionally well on benchmarks, and Qwen even surpassed GPT-4 on Chinese-language benchmarks. This comes off the back of the early-November release of the Yi 34B model, which currently has the highest score among pretrained models on the Open LLM leaderboard. But, echoing my discussion with Microsoft’s Brad Smith at Davos in January this year, can China actually produce open models? Try asking Qwen “What happened in Tiananmen Square in 1989?” Jimmy Wales has another eye-opening example with DeepSeek.
Market data
In 2023, Amazon has 750,000 robots in its warehouses, a steep rise from just 1,000 a decade earlier.
Fund managers tend to attribute good performance to internal factors 41% more often than they do bad performance.
A study revealed that people considered ChatGPT’s advice more balanced, thorough, empathetic and helpful than that of professional advice columnists, and better overall.
This Black Friday set a record with $9.8 billion in US online sales, up 7.5% from last year.
A new device accelerates the production of tailor-made chimeric antigen receptor (CAR) T-cell therapies. This innovation is a stride towards cost-effective manufacturing of personalised treatments, slashing the cost to approximately $30,000, a steep decline from the $475,000 benchmark set in 2017.
Deep tech startups need 35% more time and 48% more money than typical startups to reach over $5 million in revenue.
Short morsels to appear smart at dinner parties
🚗 Detroit is now home to America’s first stretch of road that can wirelessly charge an electric vehicle.
🌱 The first organisational chart was a beautiful plant.
🌈 The science behind prisms and Pink Floyd’s iconic album cover explained.
💸 The chicken or the egg: people’s self-esteem increases when their income rises, not the other way around.
😝 An iPhone computational error produced an unreal picture.
🦜 Is my toddler a stochastic parrot? (h/t Catherine Lückhoff)
🤰 Mothers speaking to their unborn child could help its future linguistic skills.
🐙 How does it feel to have an octopus tentacle?
🪐 Amino acids, the building blocks for life, could have formed near new stars and planets.
End note
After we’d put this issue to bed, I came across an absolutely wonderful piece of reporting on the OpenAI saga from Charles Duhigg at The New Yorker. I enjoyed it with my Saturday morning coffee. The “Turkey-Shoot Clusterfuck”, as Microsoft employees referred to it, unveils some key details of the attempted boardroom coup, what drove the board to act, and how and why it went wrong. Also worth looking out for in this essay is the sympathetic portrait of Microsoft’s CTO, Kevin Scott.
Cheers,
Azeem
What you’re up to — community updates
John C. Haven co-authored a paper from the IEEE called “Prioritising People and Planet as the Metrics for Responsible AI”
- co-authored a paper in Nature called “Remote collaboration fuses fewer breakthrough ideas”
Bill Thompson has joined the Advisory Board of the Oxford Internet Institute.
Dan Smith took part in a podcast about the biodiversity impacts of offshore wind.
Carly Kind has been appointed as the new Australian Privacy Commissioner.
- DemocracyNext has been holding citizens’ assemblies about making museums more welcoming and inclusive.
Mohan Reddy’s SkyHive made Fast Company’s next big technologies working for social good in 2023 list.
Share your updates with EV readers by telling us what you’re up to here.