🔮 Breaking the energy barrier with reversible computing

As the energy toll of AI growth and computation expands, the science of reversible computing is more important than ever

Azeem Azhar
and
David Galbraith
May 22, 2024
∙ Paid

Hi all,

I recently had a chat with EV member David Galbraith. David is an entrepreneur, VC, co-author of the RSS standard and former architect. David and I have known each other for decades… In fact, since Cher last had a number one single.

But every time we talk, I learn something new. This time, David shared his fascination with reversible computing and how it could help resolve the tension between AI progress and the energy demands of computation. We know that we have reached an impasse within the current computing paradigm.

So I invited David to bring us all up to speed. In today’s guest essay on reversible computing, David breaks down:

  • The growing energy demands of computing, especially as AI advances, and the limits of Moore’s Law.

  • The link between thermodynamics and information theory, as current computing methods generate significant heat through irreversible processes.

  • How reversible computing can reduce energy consumption and heat production by allowing operations to be undone and energy to be recaptured.

Reversible computing is not easy to understand at first, so be patient with yourself.

Thanks to David for taking the time to bring this important topic to Exponential View.



Part 1: An introduction to reversible computing

By David Galbraith

The world’s computers consume vast quantities of energy for operation and cooling. AI is intensifying this consumption, potentially tripling its current demand. Simultaneously, Moore’s law (the number of transistors per chip doubling every two years) is already faltering, just as demand for computational power soars. The resulting surge in electricity demand for computation coincides with existing pressures on the grid, from the transition to renewables, electric vehicles and heat pumps. 

We are entering a second era of electricity, where transportation, heating and knowledge are all electric. [Image: Nikola Tesla in his lab.]

The coupling of energy and information markets at an economic level mirrors the relationship between information theory and thermodynamics at a scientific level.

Technologies applied at this intersection could potentially reduce computational energy demands and deliver continued increases in performance and efficiency.

The laws of physics are poised to shape the future of computing and the digital economy that relies on it. The physics governing the relationship between information and energy dictates the heat generated by today’s computers. This is not only due to electrical inefficiency: even at 100% electrical efficiency, the very act of deleting information turns it into heat.

To overcome the physical limits of current computer performance, alternative computing architectures are required. The popular imagination is captivated by options for massive increases in performance such as could be provided by quantum computing. While certain applications and experiments are in progress, achieving broad commercial viability and full technological maturity remains several years away, based on ongoing challenges in qubit stability and error rates.

One potential solution exists here and now: reversible computing (itself one of the requirements for efficient quantum computers). In the reversible computing model, logic operations can be run backwards: their outputs retain enough information to reconstruct the original inputs, allowing the energy used to be recaptured.

This helps to minimise the heat emission that typically results from lost information. The reverse operation is in some ways similar to the cyclical operations of machines such as pendulums and combustion engines. Its scientific foundation takes us back through the earliest days of digital computing and beyond, to industrial-era steam engines and the theory of heat or, in physics speak, thermodynamics.
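To make the distinction concrete, here is a minimal sketch in Python (an illustration only, not any production reversible-logic library): an ordinary AND gate maps four input pairs onto two possible outputs and so destroys information, while the reversible Toffoli gate computes the same AND while carrying enough outputs to recover its inputs exactly.

```python
def and_gate(a: int, b: int) -> int:
    """Irreversible: four input pairs map onto two outputs, so the
    output alone is not enough to recover the inputs."""
    return a & b

def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Reversible (CCNOT): flips the target bit c only when a and b
    are both 1. With c = 0, the third output equals AND(a, b), yet
    the inputs are always recoverable because the map is a bijection."""
    return a, b, c ^ (a & b)

# Toffoli is its own inverse: applying it twice restores the inputs,
# so no information is destroyed and, in principle, no heat is forced.
for bits in [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 0)]:
    out = toffoli(*bits)
    assert toffoli(*out) == bits
    print(bits, "->", out)  # third output bit acts as AND(a, b)
```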

Nobody is sure exactly what the use cases for quantum computing will be; more importantly, they are far off and require scientific breakthroughs to develop. One of quantum computing’s prerequisites is reversible computing, a here-and-now technology that could push progress in computation on its own.

Heat and information entropy

The law on which this potential reduction in energy depends is the second law of thermodynamics, which states that entropy (unusable energy, or disorder) increases over time. As a result, everything ultimately runs down and releases waste heat.

This is the only fundamental law of physics that explicitly involves irreversibility and sets the arrow of time in one direction. Despite Newton’s laws of motion being reversible, vases crash on the floor into millions of pieces but the pieces never crash into a vase; milk stirs into coffee and never out of it; and hot things always cool by releasing their heat into a cooler environment.

Current computers also work irreversibly in terms of logic: their logic gates lose information, both through electrical inefficiency and through the one-way destruction of information itself, and that loss is ultimately released as heat. Making computers reversible is the only way to avoid an absolute minimum heat output, and matching energy input, without violating the second law. Current irreversible computing is, as a result, moving toward a looming barrier to progress dictated by the laws of physics, as we’ll see below.

When it comes to understanding entropy, the words are misleading but the underlying principles are straightforward.

Quantifying heat entropy is what enabled the scientific understanding of steam engines, the quintessential machines of the industrial era, and of how to make them efficient. That understanding ran from mechanical engineer Sadi Carnot’s analysis of their engineering properties to physicist Ludwig Boltzmann’s articulation of what was actually happening at the fundamental level of particles.

If the Eiffel tower is the ultimate symbol of the industrial age (its shape is based on a steam hammer), there is an oblique connection to entropy. The Boltzmann formula for entropy (which is enshrined on his tombstone) uses the symbol S, after Sadi Carnot, whose nephew, also called Sadi Carnot, was the president of France when the tower was built.

If the definition of entropy as disorder sounds vague or unclear, that’s because it is. When Claude Shannon was developing the theory of information that underpins the science of computers, the defining machines of the post-industrial era, he was told by von Neumann to call his quantity entropy. The reason? Because “nobody really knows what entropy is anyway”.

Shannon is up there with Einstein in terms of his contribution to our profound knowledge of the world via his scientific description of information.

We ended up with a theory of information and a theory of energy that are both based around a quantity called entropy, with nearly identical equations defining them. However, their relationship was assumed to be merely an analogy, with information entropy measured in bits and heat entropy in energy per unit temperature.
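To see how close the two formulas are, here is a minimal sketch in Python (the example distribution is arbitrary): Shannon’s information entropy, H = −Σ p log₂ p, and the Gibbs form of Boltzmann’s entropy, S = −k_B Σ p ln p, differ only in the base of the logarithm and the factor of Boltzmann’s constant.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def shannon_entropy_bits(probs):
    """Shannon's information entropy: H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Thermodynamic (Gibbs) entropy: S = -k_B * sum(p * ln p), in
    joules per kelvin. Same form, different log base and units."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]            # any probability distribution
print(shannon_entropy_bits(p))   # 1.5 bits
print(gibbs_entropy(p))          # the same quantity scaled by k_B * ln(2)
```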

There had always been a suspicion that the relationship between information and heat was more than a mere analogy, thanks to a thought experiment called Maxwell’s Demon. In it, a tiny demon operates a door to let only certain particles through, unmixing a gas and seemingly reversing entropy increase without expending energy. This would violate the second law of thermodynamics unless the demon’s handling of information were itself equivalent to a transfer of energy, which resolves the paradox. It would be nearly a century, however, before anyone proposed a specific, quantitative relationship between energy and information.

[Image: A little demon could unmix a lukewarm gas into hot and cold sides, seemingly violating the second law by letting the right particles through.]
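Here is a toy version of the demon in Python (an illustrative sketch under assumed conditions, not a physical simulation): the demon sorts randomly moving particles and must record one bit per decision, and that record carries entropy of at least k_B · ln 2 per bit, offsetting the entropy decrease of the gas (the next section quantifies this).

```python
import math
import random

K_B = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0            # assumed room temperature, K

random.seed(0)
speeds = [random.random() for _ in range(1000)]  # arbitrary speed units

fast_side = []
decisions = 0
for s in speeds:
    decisions += 1          # one measurement (>= 1 bit) per particle
    if s > 0.5:             # demon lets "fast" particles through the door
        fast_side.append(s)

# The gas is now sorted, but erasing the demon's 1000-bit measurement
# record releases at least this much heat, saving the second law:
min_heat = decisions * K_B * T * math.log(2)
print(f"sorted {len(fast_side)} fast particles; "
      f"erasing the record releases >= {min_heat:.2e} J")
```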

Finding a connection between information and energy

In 1961, physicist Rolf Landauer at IBM theorised that erasing information from a computer is inherently linked to an increase in heat entropy, effectively establishing a formal connection between information and energy. 

Landauer created a formula for the energy in joules released per binary bit deleted: at temperature T, at least k_B · T · ln 2 joules per bit, where k_B is Boltzmann’s constant. If we express his original equation in natural units of energy (multiples of k_B · T) and information (nats), the equation is astoundingly simple:

Energy (in natural units) = Information (in nats)
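As a back-of-the-envelope check (a sketch, assuming room temperature of 300 K), here is the Landauer bound in Python, showing that the energy per erased bit, measured in units of k_B · T, equals the information content of a bit measured in nats:

```python
import math

K_B = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0            # assumed room temperature, K

# Landauer's bound: erasing one bit releases at least k_B * T * ln(2) joules.
energy_per_bit = K_B * T * math.log(2)
print(f"minimum energy per erased bit: {energy_per_bit:.2e} J")  # ~2.87e-21 J

# One bit is ln(2) nats, so in units of k_B * T the erased energy equals
# the erased information measured in nats:
print(f"energy per bit in units of k_B*T: {energy_per_bit / (K_B * T):.4f}")
print(f"information per bit in nats:      {math.log(2):.4f}")  # identical
```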

Now, this doesn’t prove that energy and information are equivalent in the same way that Einstein proved that mass and energy are equivalent. But Landauer showed what happens when a bit is deleted and energy is released, and the possibility of a deeper equivalence is tantalising.
