💡 A short history of knowledge technologies
How does OpenAI's latest language model, GPT-3, compare to previous innovations?
This is a subscriber-only post which I am making open access until July 25th. Please consider subscribing.
How should we think about OpenAI’s new language model, GPT-3?
(For background on GPT-3, please read my essay AI & the Word, EV#277 on massive language models, and EV#278 on some recent demos using GPT-3.)
The model, released with a very simple API that lets even people who write code at my level play with it*, has left many observers breathless, probably with good reason. (I also recommend a typically excellent overview of GPT-3 and the community's reaction to it in Gideon Lichfield's Technology Review, which also touches on a number of key criticisms around bias and draws attention to various technical limitations.)
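To give a flavour of just how simple the API is, here is a minimal sketch of a first experiment using the openai Python client as it shipped during the beta; the engine name, prompt and parameters are my own illustrative choices rather than anything specific to this post, and access requires an API key from OpenAI's beta programme.

```python
# A rough sketch of "playing with" GPT-3 through the API at launch.
# Engine, prompt and parameters below are illustrative assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # granted via OpenAI's beta programme

response = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 model exposed by the beta API
    prompt="A short history of knowledge technologies:\n\n1.",
    max_tokens=100,     # how much text to generate
    temperature=0.7,    # higher values give more varied completions
)

print(response.choices[0].text)
```

That is essentially the whole interface: a prompt in, a completion out, which is why so many of the demos circulating this month came together in an afternoon.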
Knowledge matters
Technology helps us manipulate matter, energy or information. Today, courtesy of computation, we tend to start our experiments by manipulating information.
Written records allow us to stand on the shoulders of giants—and, crucially, bring too…