We heard today that OpenAI will launch a $20-a-month tier for ChatGPT, its chatbot service. This premium offering, with higher uptime and new features, will launch only in the US at first. Other users will have more limited access to the system.
How much is this premium offering likely to make?
According to estimates from SimilarWeb, ChatGPT hit 100m active users in January. This represents the fastest growth of any Internet product, well, ever. SimilarWeb further reckons that roughly 20% of visitors to ChatGPT are from the US.
SimilarWeb’s data for December suggests a 7-minute average duration for a visit. This compared healthily to Facebook, at 9.5 minutes, and Wikipedia, at nearly 4 minutes, but was far behind YouTube at 20 minutes or Google at more than 10.
December’s 266m visits came from around 13m visitors, suggesting about 20 visits per visitor per month. Again, this is impressive. Twitter’s website had 7bn visits in December, Reddit around 1.8bn. So how could this all translate to revenue?
Let’s start with some static estimates.
In January, 100m MAUs, of which about 20% were American, means roughly 20m American users. At different conversion rates, this converts to monthly revenues at the following levels (a quick sketch of the arithmetic follows the list):
1% conversion rate: $4m/month
2% conversion rate: $8m/month
5% conversion rate: $20m/month
10% conversion rate: $40m/month
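For the arithmetic-minded, here is a minimal sketch of the sums behind that list, assuming a flat $20/month price and the ~20m US users estimated above; purely illustrative:

```python
# Back-of-envelope: monthly revenue at a $20/month price across ~20m US users.
PRICE_PER_MONTH = 20            # USD, the reported price of the premium tier
US_USERS = 20_000_000           # ~20% of 100m MAUs

for conversion in (0.01, 0.02, 0.05, 0.10):
    revenue = US_USERS * conversion * PRICE_PER_MONTH
    print(f"{conversion:.0%} conversion: ${revenue / 1e6:.0f}m/month")
```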
To contextualise, typical freemium conversion rates are in the 2-5% range, although a recent OpenView report put it at around 6%. Exponential View’s conversion rate is in that range. Spotify’s is a whopping 46%.
A December estimate reckoned that ChatGPT cost about $3m per month to run, although volumes, but not costs, have probably increased by an order of magnitude since then.
What about more dynamic estimates?
What if use of ChatGPT continued to climb by 10% per month and OpenAI opened the paid-for service to all users? That growth rate is much lower than the service has seen so far. It is also well below what fast-growing consumer Internet services have experienced during their most rapid growth phases.
Let’s assume 10% per month, a rough doubling every seven months or so. That would take the service to roughly 300m users by the end of the year. At similar conversion rates, this could put exit revenues for 2023 at the following levels (again, see the sketch after the list):
1% conversion rate: $60m/month
2% conversion rate: $120m/month
5% conversion rate: $300m/month
10% conversion rate: $600m/month
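And the same sketch extended with the assumed 10% month-on-month growth; it uses the rounded ~300m year-end figure for the revenue lines, so treat it as illustration rather than forecast:

```python
# Back-of-envelope dynamic estimate: 10% month-on-month growth from 100m MAUs,
# with the $20/month tier assumed open to all users by the end of 2023.
PRICE_PER_MONTH = 20                  # USD
GROWTH = 0.10                         # 1.1**7 ≈ 1.95, i.e. a rough doubling every seven months

users = 100_000_000                   # January 2023 MAUs (SimilarWeb estimate)
for _ in range(12):                   # twelve months of compounding
    users *= 1 + GROWTH
print(f"Year-end users: ~{users / 1e6:.0f}m")    # ~314m, rounded to ~300m above

YEAR_END_USERS = 300_000_000          # the rounded figure behind the exit-revenue list
for conversion in (0.01, 0.02, 0.05, 0.10):
    revenue = YEAR_END_USERS * conversion * PRICE_PER_MONTH
    print(f"{conversion:.0%} conversion: ${revenue / 1e6:.0f}m/month")
```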
Even in its wonky, shonky beta form with timeouts and failure messages, ChatGPT has become an essential part of my workflow. I’d be willing to pay for it (but I won’t say how much until you have answered the poll above).
I’d love the ability to fine-tune ChatGPT more than I currently can. To caveat that, though, I’m not sure whether that is just a “faster horse” fantasy on my part. I did some intellectual jamming with it last night, and I got mentally tired before I reached the limit of ChatGPT’s ability to help me explore the ideas we were discussing. So I'm getting a lot out of it without any fine-tuning. (You can read how we use ChatGPT here.)
And because of that, I can see why it could become such a hugely appealing product for anyone involved in “knowledge production”. That runs the gamut from marketers to coders to salespeople to people just running their daily lives.
Of course, cheaper “just good enough” services might also be waiting in the wings. To write a letter of complaint to the local city council you might not need an all-singing, all-dancing premium service, after all.
So there are competitive dynamics to consider as well. Competitors might serve less demanding users with lower-cost (or even free) products. Apple, for example, might just bundle such capabilities within its products, rolled out as a software update for free. How many of us pay for Fantastical¹ rather than just use the built-in iOS calendar software?
Google could build more chat-like (or large language model) features into its existing search applications, much as Neeva and You.com have done.
That competition could eat away at the rosy back-of-envelope calculations I’ve made above.
I’d love your perspective and analysis too. The comments are open to subscribers.
¹ I pay for Fantastical. Recognise I am in a minority here.
I've been wondering what 'job' ChatGPT does - either supplementing / changing existing roles, or creating new ones. Many of the use cases shared so far feel like flavors of 'assistant', what we often call junior staff, or, in older domains, apprentice. I don't see concrete applications for that in spaces like mine right now (leading a design / strategy function for a Big Tech business unit). There's other vectors for finding a path to productization, of course, but there's also ample history and data for this one across industry and academia broadly.
There is a better ChatGPT called Claude, built by a load of OpenAI alums who jumped ship after GPT-3 (possibly earlier). It's harder to jailbreak, doesn't make so many obvious mistakes, and admits to them when it does.
I'd be interested to chat to ChatGPT, if it could actually chat. I did try to get it to write stuff, but it just parroted stuff back at me for my use case (run the male and female Hero's Journey back to back, etc.). Though I did find somebody who was working out how to give it a blank slate and prime it for what you want. Not been able to get a connection to it since. Kobold AI isn't bad if you install it locally. It has a range of modes (Game, Story, Chat, etc.) and different models you can run. Might be worth a look for the technically competent. It's a Python install that provides a web interface on a local port, just like Stable Diffusion.