Quantum memory could write itself

CQT researchers and their collaborators define a measure of memory capacity as a quantum thermodynamic resource
18 March 2019

Artist's illustration of a system in equilibrium. The definition of equilibrium in a quantum system differs from that in a classical system, requiring new approaches to the science of thermodynamics. A team including CQT researchers has defined a new measure of free energy for quantum systems called the ‘thermal information capacity’. Image: niroworld/Shutterstock.com

The phrase ‘knowledge is power’ holds true in physics. In the science of thermodynamics, the capacity for a system to do work turns out to be closely connected to its capacity to store information. This has led to a well-developed theory of information thermodynamics, where energy sources are no longer just cold and hot reservoirs, but any structured data within noise.

The world, however, is fundamentally quantum. Both energy and data can exist in quantum superpositions. In this context, the conversion between power and knowledge in quantum systems is far less well understood.

In a paper published 13 February in Physical Review Letters, CQT’s Mile Gu and his colleagues sought to crystallise these links. Together, they developed a new means to characterise free energy at the quantum scale by considering the capacity to store information. Part of this achievement is working out an explicit writing mechanism, one in which a quantum system stores structured data while using no sources of energy other than what is contained within itself.

“The obvious application of this would be nano-engines that are able to extract work from structure at the quantum scale,” says Mile. It’s a technique that could ultimately find applications in sensors and computers operating in remote environments with limited energy.

The paper’s authors are Mile and Jayne Thompson at the Centre for Quantum Technologies at the National University of Singapore, Varun Narasimhachar at the Nanyang Technological University (NTU), Jiajun Ma at Tsinghua University, China, and Gilad Gour at the University of Calgary, Canada. Mile, an NRF Research Fellow, is also on the faculty of the Complexity Institute and School of Physical and Mathematical Sciences at NTU.

Information engines

Thermodynamics emerged in the 19th century from studies of steam engines. The first formulations of its laws came from calculating how to get energy out of gases at different temperatures. Even then, there were hints of something overlooked.

A core idea is that engines operate by having a temperature gradient: we extract useful work by transferring heat from the hot to the cold reservoir. Thus, our standard engines burn fuel to make something hot. However, as heat is transferred from the hotter to the colder object, everything eventually ends up in thermal equilibrium. Once that happens, no more work is possible. This is essentially the second law of thermodynamics.

In 1867, the physicist James Clerk Maxwell proposed a thought experiment that seemed to violate the second law. He imagined a box with two compartments containing gas in thermal equilibrium. The second law of thermodynamics would say that it is impossible to use this box to do useful work.

Maxwell imagined adding a ‘demon’ operating a door between the compartments. The demon monitors which particles are hotter than average and lets them pass through the door only from left to right. Meanwhile, the demon lets colder particles pass only from right to left. Slowly but surely, the demon establishes a temperature gradient. Hooking up a heat engine would then seemingly get you work for free.

Ultimately, the resolution to this paradox is information. Tracking the gas particles requires the demon to have a memory, which consumes energy and so explains why the engine would not actually provide work for free. Scientists continued to study Maxwell’s demon, refining ideas about the connection between information and work over the next 100 years and more.

Notable developments included ‘Szilard’s engine’, a thought experiment proposed in 1929 which showed directly that work could be extracted from information. ‘Landauer’s principle’, developed in the 1960s, sets a lower bound on the energy cost of erasing information from memory.
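Landauer's bound is concrete enough to compute. The following short sketch (an illustration, not part of the paper discussed here) evaluates the minimum energy dissipated when erasing a single bit, k_B T ln 2, at room temperature:

```python
import math

# Boltzmann constant in joules per kelvin (CODATA exact value)
K_B = 1.380649e-23

def landauer_bound(temperature_kelvin):
    """Minimum energy in joules dissipated when erasing one bit,
    per Landauer's principle: E >= k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K) the bound is tiny: about 2.9e-21 J per bit
print(landauer_bound(300.0))
```

The bound is some twenty orders of magnitude below everyday energy scales, which is why it only becomes relevant for devices operating near the thermodynamic limits that this research probes.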

Quantum consequences

Such research was focused on classical information, assuming bits that are 0s or 1s. With the more recent advent of quantum information, researchers are revisiting established ideas of thermodynamics to check how they apply to quantum bits, which can exist in superpositions of states.

Going quantum has consequences. Varun, first author on the paper, explains: “The key feature to recognise is that anything out of equilibrium is a potential source of free energy. When you get to quantum scales, there are fundamentally different ways to measure how far from equilibrium you are. In classical physics, being out of equilibrium means having a different distribution over the energy spectrum. Quantum systems can also enable quantum coherences between energy levels. You can get states that are indistinguishable from equilibrium when measured in the energy basis, and yet are not in equilibrium.”
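Varun's point can be made explicit with density matrices. The sketch below (an illustration of the general idea, with an assumed ground-state population of 0.7, not taken from the paper) builds a diagonal thermal state and a pure state with coherences that has identical populations in the energy basis, so energy measurements alone cannot distinguish them:

```python
import numpy as np

# Assumed ground-state population for this illustration
p = 0.7

# Diagonal density matrix in the energy basis: a genuine equilibrium state
thermal = np.array([[p, 0.0], [0.0, 1 - p]])

# Same energy-basis populations, but with maximal off-diagonal coherence
c = np.sqrt(p * (1 - p))
coherent = np.array([[p, c], [c, 1 - p]])

# Energy measurements see only the diagonal: identical for both states
print(np.diag(thermal), np.diag(coherent))

# Yet the states differ: the coherent one is pure (Tr[rho^2] = 1),
# while the thermal one is mixed (Tr[rho^2] < 1)
print(np.trace(thermal @ thermal), np.trace(coherent @ coherent))
```

The second state is exactly the kind Varun describes: indistinguishable from equilibrium when measured in the energy basis, yet not in equilibrium, and therefore a potential source of free energy.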

In their work, the researchers propose a means to characterise the energetic content within such states using a new measure. They defined a quantity called the ‘thermal information capacity’, which is the average number of bits a system could retain with no external energy source. For example, should a qubit have a thermal information capacity of 0.5, then two of them combined could retain one bit of data.
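The qubit example rests on the capacity adding across identical systems. A minimal sketch of that arithmetic, assuming the simple additivity the article's example implies (the function name is hypothetical, not from the paper):

```python
def storable_bits(capacity_per_system, n_systems):
    """Total bits retainable by n identical systems, assuming the
    per-system thermal information capacity is additive, as in the
    article's example: two qubits of capacity 0.5 store one bit."""
    return capacity_per_system * n_systems

print(storable_bits(0.5, 2))  # 1.0 bit, matching the article's example
```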

“We show that the information you can store is a new way to measure free energy that works at the level of individual quantum particles,” says Varun.

The work adds to researchers’ developing understanding of quantum thermodynamics. “I think the result is particularly nice in uniting two distinct concepts – out-of-equilibrium quantum systems and memory capacity,” says Mile.