In the 1970s, few companies could afford computers: large mainframes were expensive. The IBM System/370 cost $4,674,160 in 1970 (about $35m in today’s terms). The desktop PC revolution changed that. By the early 1980s, the entry price for a usable computer was more than 1,000 times lower than a decade earlier. Over the next 20 years, the sticker price of these machines declined gently, but their capabilities (described by Moore’s Law, Kryder’s Law, Keck’s Law and Gilder’s Law) improved dramatically.
The result was a huge expansion - a democratisation - in the number of organisations that could afford a computer. No longer was computing the preserve of NASA or Midland Bank or American Airlines; small firms could take advantage of it too. My parents’ small accountancy practice, initially just my late father and my mum in the East End of London, could afford an Apple II Europlus with 64k of RAM and two floppy disc drives. Running an integrated accounting package from Basingstoke’s Padmede Computer Services, our computer supported the early growth of my dad’s practice.
We’ve lived through the impact of technology’s exponential deflation. It leads to a democratisation of capabilities: things that only governments or large firms once had access to become available to more and more firms across the economy.1
Let’s use that lens to look at the impact of the modern cohort of AI tools built on large language models, in particular chatbots.2
I’ve been using OpenAI’s chatbot, ChatGPT, pretty heavily as an assistant over the past few weeks. But I also use Bearly (for summarisation) and other tools, like WordTune. What expensive resource do these software machines mimic?