🔮 What would you do with an abundance of computing power?
Computing power as a strategic asset
What would you do with 1000x more computing power? How would your organisation use it?
If you were to put these questions to Sam Altman, Satya Nadella or Sundar Pichai, they would have an answer. Do you?
Several months ago, I took this question to my friend François Candelon, at the time Global Director of the BCG Henderson Institute (now a partner at the private equity firm Seven2). We agreed that we were both hearing concerns from senior leaders about whether compute would remain affordable, especially in the face of growing demands from AI.
But this is the wrong question to ask. The right question is: what would you do with an abundance of computing power?
For the past few months, a team from the BCG Henderson Institute (Riccarda Joas and David Zuluaga Martínez) and Exponential View’s Nathan Warren have been working on this question.
In Part 1, we investigated the question at the top of executives’ minds: will the boom in generative AI impact the availability of affordable computing power? We found that there will likely be an abundance of computing power in the near future despite the increasing demand. Under our bullish model, demand for genAI inference would not exceed ~34% of computing supply.
In Part 2, we’ll aim to get you thinking about the more exciting question: how can businesses leverage this abundance of computing power? Let’s go!
The power of compute
Computing power — the capability to analyse data for decision-making — fuels all our information systems. New businesses have come along using compute to disrupt the old. The ascent of digital-first businesses like Amazon, Netflix, Google and their peers was deeply disruptive for their respective industries. But there are many more non-obvious examples.
One such case is Moderna, valued today at ~$50 billion. Moderna was formed around 2010 and is often described as a biotech startup born in the AWS cloud. Its computationally driven approach, built over ten years, made it possible for Moderna to go from analysing a sequence to finding the protein to target in 42 days, and to the first clinical-grade batch of its Covid-19 vaccine in 65 days. This was unprecedented (at the time, I wrote What if every day was a pandemic day?) and would normally take many years (2-5 years for R&D and 1-2 years for pre-clinical testing). Similarly, for its Zika virus vaccine, Moderna reached the clinical trial stage after a year; in traditional pharmaceutical R&D, it would take around 10 years.
Another case study we looked at, perhaps less well known, is Amadeus IT Group, valued today at ~$28 billion. The company specialises in transaction-processing solutions for search, pricing, booking and ticketing across the airline, hospitality and car rental sectors. Amadeus processes over 100,000 customer transactions per second at peak times, the same order of magnitude as the credit card company Visa, thanks to a trailblazing move in the early 2000s. The company shifted from a traditional mainframe system (IBM’s z/TPF) to open systems (a giant network of Linux machines) run in its own data centres, and in 2016 to the cloud (Microsoft Azure). This allowed the company to accommodate the dramatic increase in the look-to-book ratio (from 10-to-1 to 1,000-to-1) prompted by the rise of online booking tools. z/TPF is a processing system that, in principle, was well suited to handling transactions, yet it was overwhelmed by the demand. With the advent of cloud computing around that time, more computing power became obtainable and Amadeus could run its operations at scale.
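To make the look-to-book shift concrete, here is a back-of-the-envelope sketch in Python (the booking volume is an invented, illustrative figure, not Amadeus’s actual traffic): holding bookings flat, the move from 10-to-1 to 1,000-to-1 means roughly 100x more search traffic to process.

```python
# Illustrative only: what the look-to-book shift means for search traffic.
# The booking volume below is a made-up figure, not Amadeus's actual numbers.
bookings_per_day = 1_000_000

searches_before = bookings_per_day * 10     # 10-to-1 look-to-book ratio
searches_after = bookings_per_day * 1_000   # 1,000-to-1 look-to-book ratio

print(searches_after / searches_before)     # 100.0 -> ~100x more search load to process
```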
What does this mean for you?
Computing power is an important strategic asset. Greater compute, at near-zero processing cost and coupled with AI capabilities, can have deeply disruptive effects.
So, if you’re a business leader, what should you do?
For computing power to be effective, high-quality input data is essential. Without it, your computing power has nothing to process. An estimated 90% of business data is unstructured; with generative AI, you now have a method to structure that data and create more organised data sources for your business.
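As a minimal sketch of what that structuring step can look like, assuming a generic LLM API: the snippet below asks a generative model to turn a free-text invoice note into a JSON record. The call_llm stub, the field names and the sample text are hypothetical stand-ins for whichever model, API and schema your organisation actually uses.

```python
import json

def call_llm(prompt: str) -> str:
    """Stand-in for a real generative-model API call (OpenAI, Anthropic, an internal
    gateway, etc.). Here it returns a canned response so the sketch runs end to end."""
    return '{"supplier": "Acme GmbH", "date": "2024-03-02", "total": "EUR 1240.00"}'

def extract_fields(document_text: str) -> dict:
    """Ask a generative model to turn free text into a structured, queryable record."""
    prompt = (
        "Extract the supplier, invoice date and total amount from the text below. "
        "Reply with a JSON object using the keys supplier, date, total.\n\n"
        + document_text
    )
    return json.loads(call_llm(prompt))

record = extract_fields("Invoice from Acme GmbH, dated 2 March 2024, total EUR 1,240.00")
print(record)  # {'supplier': 'Acme GmbH', 'date': '2024-03-02', 'total': 'EUR 1240.00'}
```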
Once you have your data, you should identify approaches that are currently not pursued because they seem computationally too difficult or costly. You should also inspect data that remains unused due to processing constraints. Leaders could go even further and ask themselves: What if I had infinite computing power at zero cost and no environmental concerns? What if we reach a point where we have too much computing power?
One approach to answering these questions is to revisit your company’s value chain to explore the space of possibilities. This includes, fundamentally, questioning current business assumptions, searching for analogies to inspire imagination and thinking through the break-even point of investment decisions into computing power. We considered three areas here: (1) value proposition, (2) value creation and delivery and (3) value capture.
Value proposition: Most companies struggle to create lasting customer satisfaction and to cater to changing customer preferences. To offer uniqueness to customers, companies would need to tailor their value propositions to each customer segment as closely as possible. There are vast differences among individuals and segments, so this can be incredibly difficult. Increasing computing power is part of the solution: if, for example, Meta Ads or Spotify could process an even larger amount of data about their users, the offering on those platforms would be more precisely tailored to you.
Leaders should ask: How would computing power impact the customer experience and strengthen relationships? What are the problems that computing power cannot solve and why?
Value creation and delivery: Many companies deal with insufficient capabilities, resources and poor process design. Even well-resourced organisations like Pfizer or Alphabet's Calico face these challenges to some degree. However, imagine if these companies could overcome such limitations and create even more complex models to simulate molecular interactions—they might get closer to finding a cure for cancer or a recipe for longevity.
Leaders should ask: What capabilities and engineering skills are needed to optimise or automate given more computing power? What is the impact of computing power on my data requirements and intellectual property? What might happen beyond the boundaries of my firm? How is the ecosystem evolving?
Value capture: Businesses are confronted with competitors’ pricing tactics and consumers’ price sensitivity. Consider the route optimisation algorithms used by companies like DHL or Maersk, which have lowered operational costs in the past. Imagine if it were possible to include even more input parameters and further accelerate computing time.
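To see why extra compute matters here, consider a minimal brute-force routing sketch (the distance matrix is invented for illustration and is unrelated to DHL’s or Maersk’s actual systems). With n stops there are (n-1)! possible orderings, so every additional stop or constraint multiplies the computing power needed; real logistics solvers are far more sophisticated, but they face the same trade-off.

```python
from itertools import permutations

# Toy distance matrix for four delivery stops (indices 0-3); values are illustrative.
dist = [
    [0, 12, 19, 8],
    [12, 0, 7, 14],
    [19, 7, 0, 11],
    [8, 14, 11, 0],
]

def best_route(depot: int = 0) -> tuple:
    """Brute-force route search: with n stops there are (n-1)! orderings to try,
    so every extra stop or constraint multiplies the compute needed."""
    stops = [i for i in range(len(dist)) if i != depot]
    best_cost, best_path = float("inf"), None
    for order in permutations(stops):
        path = [depot, *order, depot]
        cost = sum(dist[a][b] for a, b in zip(path, path[1:]))
        if cost < best_cost:
            best_cost, best_path = cost, path
    return best_cost, best_path

print(best_route())  # (38, [0, 1, 2, 3, 0]) -- the cheapest loop through all stops
```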
Leaders should ask: How might computing power influence the revenue model to choose, e.g., freemium, membership, subscription, or other new forms? How might increasing computing power lower costs?
Computation has enabled humankind to solve puzzles and tackle ever-larger problems. Given current dynamics, now might be the time for big-scale thinking: first ask what the job to be done is, then leverage computing power, once available at the required scale, to get it done. Some problems that are currently not economically viable or technically feasible may become possible as computing power grows. And because new computational techniques in AI allow us to leverage and process data in fundamentally new ways, we have a whole class of new problems we can throw compute at.