Exponential View

🤖 The state of AI

Azeem Azhar
Oct 03, 2020
∙ Paid

For the past three years, Ian Hogarth and Nathan Benaich have put together the State of AI Report. Things are moving fast: the 2020 edition is 30% bigger than 2019's.

Key highlights

  • A new generation of transformer models is unlocking NLP use cases. (Read my essay on the growth of NLP.)

  • Huge models and massive training costs dominate today's hottest research areas.

  • Biology is experiencing its AI moment.

The report is very rich and also touches on issues of talent and policy. Worth digging into.

Big models

A lot has been made of the size of the new models, and of how they might put such research out of reach of firms with fewer resources. Some people suggest OpenAI's latest model, GPT-3, might have cost $10m to train. (I discuss this point with Sam Altman, the head of OpenAI, in the new season of the Exponential View podcast. Be sure to tune in from Wednesday 7th October.)
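The $10m figure is plausible on a back-of-envelope basis: training compute scales roughly with parameter count times training tokens. A minimal sketch of that arithmetic, where every constant (parameter count, token count, the "6 FLOPs per parameter per token" rule of thumb, hardware throughput, and hourly price) is my own illustrative assumption, not a figure from the report:

```python
# Rough estimate of GPT-3's training cost. All constants below are
# illustrative assumptions, not figures from the State of AI Report.

params = 175e9   # GPT-3 parameter count
tokens = 300e9   # approximate training tokens

# Common rule of thumb: ~6 FLOPs per parameter per training token.
total_flops = 6 * params * tokens  # ~3.15e23 FLOPs

effective_flops_per_sec = 2.8e13  # assumed sustained throughput per accelerator
price_per_hour = 3.00             # assumed cloud price per accelerator-hour

accelerator_hours = total_flops / (effective_flops_per_sec * 3600)
cost = accelerator_hours * price_per_hour
print(f"~{accelerator_hours:,.0f} accelerator-hours, ~${cost / 1e6:.1f}m")
```

With these assumed numbers the estimate lands around $9m, i.e. the same order of magnitude as the $10m figure; the real cost depends heavily on hardware utilisation and negotiated cloud pricing.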

But the analysis is not so simple. The larger, more complex models need less data than a smaller peer to get the sam…

This post is for paid subscribers
