🤖 The state of AI
For the past three years, Ian Hogarth and Nathan Benaich have put together the State of AI Report. Things are moving fast: 2020's edition is 30% bigger than 2019's.
Key highlights
- A new generation of transformer models is unlocking NLP use cases. (Read my essay on the growth of NLP.)
- Huge models and massive training costs dominate today's hottest research area.
- Biology is experiencing its AI moment.
The report is very rich and also touches on issues of talent and policy. It is worth digging into.
Big models
A lot has been made of the size of the new models, and of how they might put such research out of reach of firms with smaller resources. Some people suggest that OpenAI's latest model, GPT-3, might have cost $10m to train. (I discuss this point with Sam Altman, the head of OpenAI, in the new season of the Exponential View podcast. Be sure to tune in from Wednesday 7th October.)

But the analysis is not so simple. Larger, more complex models need less data than a smaller peer to get the sam…