- A new method could allow modern CPUs to perform computations using 1,000 times less energy than the Landauer limit (0.0172 electronvolts).
- It works by synchronizing the operations with the processor’s temperature oscillations.
In digital electronics, energy dissipation is one of the main design considerations today. As processing units continue to shrink, the energy required per computation decreases. Sooner or later, transistors will hit a theoretical limit on the minimum energy required to perform a single operation.
Recently, Jan Klaers, a professor at the University of Twente in the Netherlands, proposed that ‘squeezed thermal states’ can be used to get around this theoretical limit. These states allow the processing unit to operate as if it were at a lower temperature, and thus to consume less energy.
Existing computer technology can take advantage of these states that occur naturally within the thermal environment of a processor. And in the future, squeezed states could be exploited to develop more energy-efficient electronic devices.
Amount of Energy Required To Erase A Bit
In 1961, the German-American physicist Rolf Landauer, while working at IBM, developed a thermodynamic model of the digital bit. According to this model, any logically irreversible operation that alters data, such as resetting or erasing a bit of memory, increases entropy, and a corresponding amount of energy is dissipated as heat.
He calculated the lowest possible energy needed to erase or reset a bit (from the 1-state to the 0-state): k_B * T * ln(2) ≈ 0.7 * k_B * T, where k_B is the Boltzmann constant and T is the temperature of the surrounding environment.
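As a quick sanity check, the limit can be evaluated numerically. The sketch below assumes a room temperature of 300 K; the exact figure in electronvolts shifts slightly with the temperature one assumes.

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
eV = 1.602176634e-19  # joules per electronvolt

T = 300.0  # assumed room temperature, K

# Landauer's minimum energy to erase one bit: k_B * T * ln(2)
E_landauer = k_B * T * math.log(2)

print(f"{E_landauer:.3e} J = {E_landauer / eV:.4f} eV")
```

At 300 K this comes out near 0.018 eV; a few kelvin lower gives the 0.0172 eV figure quoted above.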
This principle is also relevant to quantum information, quantum computing and reversible computing. In 2012, researchers performed experiments to confirm this principle.
However, the principle holds only if the system is in thermal equilibrium. Jan Klaers has now developed a new approach in which the bit remains in equilibrium while the ‘heat bath’ (the surrounding environment) is driven out of equilibrium by thermal squeezing.
The term ‘squeezing’ refers to fluctuations (associated with noise) that are unevenly distributed across the different dimensions of the system. In the squeezed phase, the bath oscillates between two temperatures: one below and one above the average temperature. This means that the bath can be either cold or hot at a particular moment.
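The uneven redistribution of fluctuations can be pictured with a toy numerical sketch. Everything here is illustrative: the two dimensions x and p and the squeezing strength r are hypothetical labels, not quantities from Klaers' model.

```python
import math
import random
import statistics

random.seed(0)
N = 100_000

# Thermal (equilibrium) noise: fluctuations spread evenly over both
# dimensions of the system, labelled x and p here.
x_th = [random.gauss(0.0, 1.0) for _ in range(N)]
p_th = [random.gauss(0.0, 1.0) for _ in range(N)]

# Squeezed noise: the same fluctuations redistributed unevenly --
# reduced along x at the price of being amplified along p.
r = 0.5  # hypothetical squeezing strength
x_sq = [random.gauss(0.0, math.exp(-r)) for _ in range(N)]
p_sq = [random.gauss(0.0, math.exp(+r)) for _ in range(N)]

for name, data in [("thermal x", x_th), ("thermal p", p_th),
                   ("squeezed x", x_sq), ("squeezed p", p_sq)]:
    print(f"{name}: variance = {statistics.pvariance(data):.3f}")
```

The squeezed x-variance drops below the thermal value while the p-variance rises above it: the noise has not vanished, only moved.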
A network of connected bits in a computer circuit | Performing operations on these bits requires energy which eventually dissipates as heat (red) | Credit: J. Klaers/University of Twente
To erase or reset the bit with the minimum possible energy, the operation must be carried out when the bath is in its cold state. One could therefore synchronize operations with the temperature oscillations to reduce the energy cost of resetting the bit. In fact, Klaers found that there is no lower limit on this energy cost: the more the bath is squeezed, the lower the cost becomes.
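The scheduling idea can be sketched as a cartoon model: treat the bath temperature as a sinusoid and compare the equilibrium Landauer cost of an erasure timed to the cold phase against one paid at the mean temperature. This is a simplification, not Klaers' actual nonequilibrium treatment, and every number below (mean temperature, amplitude, frequency) is a hypothetical illustration.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def erase_cost(T):
    """Equilibrium Landauer cost at instantaneous bath temperature T (a simplification)."""
    return k_B * T * math.log(2)

T_avg, dT = 300.0, 60.0  # hypothetical mean temperature and oscillation amplitude, K
f_clock = 3e9            # hypothetical oscillation frequency, Hz

def bath_temperature(t):
    return T_avg + dT * math.sin(2 * math.pi * f_clock * t)

# Unsynchronized: erasing at a random phase costs, on average, the mean-temperature price.
cost_unsynced = erase_cost(T_avg)

# Synchronized: schedule the erasure at the coldest point of the cycle.
t_cold = 0.75 / f_clock  # phase where sin() = -1
cost_synced = erase_cost(bath_temperature(t_cold))

print(f"unsynced: {cost_unsynced:.3e} J, synced: {cost_synced:.3e} J")
print(f"savings: {1 - cost_synced / cost_unsynced:.0%}")
```

With these made-up numbers the synchronized erasure pays only T_cold / T_avg = 0.8 of the mean-temperature cost; a more strongly squeezed bath (a colder cold phase) would push the ratio lower still.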
Can We Apply This Technique In Existing Computers?
According to the researcher, this method could be applied to today’s computer bits and could cut the energy cost of a bit operation to as little as one-thousandth of the Landauer limit.
There is a big advantage of using this technique in modern computer technology: the squeezed thermal surroundings come for free. Modern processors perform computation by erasing or switching millions of bits, and each operation releases a certain amount of heat.
Since this heat emission occurs at a frequency set by the CPU clock, the temperature of the bath oscillates. Although such oscillations have intricate patterns, their behavior amounts to a form of squeezing.
Engineers could exploit this squeezing so that bits are always processed when the bath is in its cold state. Computations synchronized with these thermal oscillations would then consume less power, though exactly how much less remains to be examined.