Introduction to Quantum Thermodynamics
The theory of quantum thermodynamics investigates how the concepts of heat,
work, and temperature can be carried over to the quantum realm, where fluctuations and randomness are fundamentally unavoidable. Of particular practical relevance is the investigation of quantum thermal machines: machines that use the flow of heat to perform some useful task. In this lecture series, we give a brief introduction to how the laws of thermodynamics arise from quantum theory and how thermal machines can be described with Markovian quantum master equations. Recent results are illustrated with examples such as a quantum dot heat engine and a qubit entangler.
Quantum thermodynamics investigates heat, work, temperature, and related concepts in quantum systems. As these concepts are very general, quantum thermodynamics is of relevance for essentially all scenarios involving open quantum systems. This makes the field extremely broad, including well-established topics such as thermoelectricity, which investigates how an electrical current arises from a temperature gradient, as well as novel approaches such as the resource theory of thermodynamics, an approach that has its origins in entanglement theory. The broad nature of the field implies that the quantum thermodynamics community brings together people with very different backgrounds who share common interests.
This course is meant to provide a short introduction to this diverse and fast-growing field,
introducing basic concepts and illustrating them with simple examples. After this course, you
should…
- …know how the first and second law of thermodynamics emerge from quantum theory,
- …know what a Markovian master equation is and under what assumptions it can be employed,
- …anticipate what happens when coupling simple systems (such as a qubit or a quantum dot) to one or several thermal reservoirs,
- …be able to calculate observable properties of out-of-equilibrium systems such as heat and charge currents.
There are several good resources that substantially go beyond the material covered in this
course. Very recently, a book on the topic was published that is meant to give a snapshot
of the field, providing numerous short reviews on different topics by over 100 contributing
authors. A number of good reviews are also available; these resources are further complemented by a focus issue on the topic.
Since this course aims at introducing an advanced topic in a short amount of time, some
concepts will be used without introduction. Notably, the density matrix formalism and second
quantization will be used throughout the course.
See the literature for a good introduction to these respective topics. In addition, basic knowledge of quantum theory and classical thermodynamics is helpful.

2 Basic concepts
In this section, we introduce some basic concepts that are used throughout the course. We set
ħ = 1 throughout the course.
2.1 The thermal state
In the grand-canonical ensemble, the thermal state (Gibbs state) is given by
τ̂_{β,µ} = e^{−β(Ĥ − µN̂)} / Z,   Z = Tr{e^{−β(Ĥ − µN̂)}},   (1)
where Ĥ denotes the Hamiltonian of the system, N̂ the particle number operator, β = 1/(k_B T)
the inverse temperature (with k_B being the Boltzmann constant), µ the chemical potential, and
Z is called the partition function. There are different motivations for the physical relevance of
the thermal state. Consider a small part of a large system, where the large system has a fixed
energy and fixed number of particles. It can be shown that the small system is described by
the thermal state (under some assumptions). Therefore, if a system can exchange both
energy and particles with an environment, the system is expected to be well described by the
thermal state when it is in equilibrium with the environment. The mentioned assumptions
will be discussed later in the course, when we discuss equilibration in terms of microscopic
equations of motion.
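As a concrete numerical sketch (my own illustration, not part of the notes, assuming a single fermionic mode with energy eps so that Ĥ = eps·n̂ and N̂ = n̂), the Gibbs state of Eq. (1) can be constructed directly:

```python
import numpy as np
from scipy.linalg import expm

eps, beta, mu = 1.0, 2.0, 0.5          # energy, inverse temperature, chemical potential
H = np.diag([0.0, eps])                 # Hamiltonian in the occupation basis {|0>, |1>}
N = np.diag([0.0, 1.0])                 # particle-number operator

w = expm(-beta * (H - mu * N))          # unnormalized Gibbs weight
Z = np.trace(w)                         # partition function
tau = w / Z                             # thermal state, Eq. (1)

# For this single mode, the occupation follows the Fermi-Dirac distribution.
occupation = np.trace(tau @ N)
fermi_dirac = 1.0 / (np.exp(beta * (eps - mu)) + 1.0)
print(np.isclose(occupation, fermi_dirac))  # True
```

The same construction works for any finite-dimensional Hamiltonian; only the matrices change.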
The thermal state can also be motivated from a principle of maximal entropy. Consider a scenario where the mean energy ⟨Ĥ⟩ and the mean particle number ⟨N̂⟩ are given (physically, they are determined by the temperature and the chemical potential of an environment). In this case, the thermal state maximizes the von Neumann entropy (see box). To see
this, consider a state ρ̂ with the same mean values as the thermal state. We can then write
S_vN[ρ̂] = −Tr{ρ̂ ln ρ̂} ≤ −Tr{ρ̂ ln τ̂_{β,µ}} = −Tr{ρ̂ [−β(Ĥ − µN̂) − ln Z]} = S_vN[τ̂_{β,µ}],   (2)
where we have used the inequality
S[ρ̂||σ̂] = Tr{ρ̂ ln ρ̂ − ρ̂ ln σ̂} ≥ 0.   (3)
Here S[ρ̂||σ̂] denotes the quantum relative entropy (see box) and the equality is obtained only
for ρ̂ = σ̂.
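The inequality in Eq. (3) (Klein's inequality) can be checked numerically for random density matrices; a sketch of my own, using the standard Ginibre construction to sample full-rank states:

```python
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(0)

def random_density_matrix(d, rng):
    # Random full-rank density matrix: rho = G G† / Tr{G G†} with G a Ginibre matrix
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho)

def relative_entropy(rho, sigma):
    # S[rho||sigma] = Tr{rho ln rho - rho ln sigma}, Eq. (3)
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

# Klein's inequality: nonnegative, vanishing only for rho = sigma.
for _ in range(100):
    rho = random_density_matrix(3, rng)
    sigma = random_density_matrix(3, rng)
    assert relative_entropy(rho, sigma) >= -1e-10
print(np.isclose(relative_entropy(rho, rho), 0.0))  # True
```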
Finally, the thermal state is the only completely passive state [6]. This means that even if we
have many copies of a thermal state, its energy cannot be lowered by any unitary operation,
i.e.,
Tr{Ĥ_N Û τ̂_{β,µ}^{⊗N} Û†} ≥ Tr{Ĥ_N τ̂_{β,µ}^{⊗N}},   ∀ N, Û,   (4)
where Ĥ_N denotes the Hamiltonian corresponding to N copies of the thermal state.
The last expression can be interpreted as follows: no work can be extracted from N copies of the thermal state. It can be proven that the thermal state is the only state with this property.
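The passivity property above can be probed numerically for a single copy (N = 1); a sketch of my own, with an assumed qubit Hamiltonian, sampling Haar-random unitaries and checking that none of them lowers the energy of the thermal state:

```python
import numpy as np
from scipy.stats import unitary_group
from scipy.linalg import expm

# Assumed illustration: H = diag(0, 1), tau the Gibbs state at beta = 1 (mu = 0).
beta = 1.0
H = np.diag([0.0, 1.0])
tau = expm(-beta * H)
tau /= np.trace(tau)

E0 = np.trace(H @ tau).real             # energy of the thermal state
rng = np.random.default_rng(1)
for _ in range(500):
    U = unitary_group.rvs(2, random_state=rng)  # Haar-random unitary
    E = np.trace(H @ U @ tau @ U.conj().T).real
    assert E >= E0 - 1e-10              # no unitary extracts work
print("passive")
```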
Imagine a billiard ball bouncing around on a pool table. High-school level physics enables us to predict its motion until the end of time using simple equations for energy and momentum conservation, as long as you know the initial conditions – how fast the ball is moving at launch, and in which direction.
What if you add a second ball? This makes things more complicated, but predicting the future state of this system would still be possible based on the same principles. What about if you had a thousand balls, or a million? Technically, you could still apply the same equations, but the problem would not be tractable in any practical sense.
Thermodynamics lets us make precise predictions about averaged (over all the particles) properties of complicated, many-body systems, like millions of billiard balls or atoms bouncing around, without needing to know the gory details. We can make these predictions by introducing the notion of probabilities. Even though the system is deterministic – we can in principle calculate the exact motion of every ball – there are so many balls in this system that the properties of the whole will be very close to the average properties of the balls. If you throw a six-sided die, the result is in principle deterministic and predictable, based on the way you throw it, but it’s in practice completely random to you – it could be 1 through 6, equally likely. But you know that if you cast a thousand dice, the average will be close to 3.5 – the average of all possibilities. Statistical physics enables us to calculate a probability distribution over the energies of the balls, which tells us everything about the average properties of the system. And because of entropy – the tendency for the system to go from ordered to disordered configurations – even if the probability distribution of the initial system is far from the one statistical physics predicts, after the system is allowed to bounce around and settle, this final distribution will be extremely close to a generic distribution that depends on average properties only. We call this the thermal distribution, and the process of the system mixing and settling to one of the most likely configurations – thermalization.
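The dice claim above is easy to check for yourself; a tiny simulation (my own, not from the post):

```python
import random

# One throw is unpredictable, but the average of many throws
# concentrates near 3.5 (law of large numbers).
random.seed(0)
throws = [random.randint(1, 6) for _ in range(100_000)]
average = sum(throws) / len(throws)
print(abs(average - 3.5) < 0.05)  # True
```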
For a practical example – instead of billiard balls, consider a gas of air molecules bouncing around. The average energy of this gas is proportional to its temperature, which we can calculate from the probability distribution of energies. Being able to predict the temperature of a gas is useful for practical things like weather forecasting, cooling your home efficiently, or building an engine. The important properties of the initial state we needed to know – energy and number of particles – are conserved during the evolution, and we call them “thermodynamic charges”. They don’t actually need to be electric charges, although it is a good example of something that’s conserved.
Let’s cross from the classical world – balls bouncing around – to the quantum one, which deals with elementary particles that can be entangled, or in a superposition. What changes when we introduce this complexity? Do systems even thermalize in the quantum world? Because of the above differences, we cannot in principle be sure that the mixing and settling of the system will happen just like in the classical cases of balls or gas molecules colliding.
It turns out that we can predict the thermal state of a quantum system using very similar principles and equations that let us do this in the classical case. Well, with one exception – what if we cannot simultaneously measure our critical quantities – the charges?
One of the quirks of quantum mechanics is that observing the state of the system can change it. Before the observation, the system might be in a quantum superposition of many states. After the observation, a definite classical value will be recorded on our instrument – we say that the system has collapsed to this state, and thus changed its state. There are certain observables that are mutually incompatible – we cannot know their values simultaneously, because observing one definite value collapses the system to a state in which the other observable is in a superposition. We call these observables noncommuting, because the order of observation matters – unlike in multiplication of numbers, which is a commuting operation you’re familiar with. 2 * 3 = 6, and also 3 * 2 = 6 – the order of multiplication doesn’t matter.
Electron spin is a common example that entails noncommutation. In a simplified picture, we can think of spin as an axis of rotation of our electron in 3D space. Note that the electron doesn’t actually rotate in space, but it is a useful analogy – the property is “spin” for a reason. We can measure the spin along the x-, y-, or z-axis of a 3D coordinate system and obtain a definite positive or negative value, but this observation will result in a complete loss of information about spin in the other two perpendicular directions.
If we investigate a system that conserves the three spin components independently, we will be in a situation where the three conserved charges do not commute. We call them “non-Abelian” charges, because they enjoy a non-Abelian, that is, noncommuting, algebra. Will such a system thermalize, and if so, to what kind of final state?
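The noncommutation of the spin components can be verified directly with the Pauli matrices, which represent them (up to a constant factor). A quick check, my own illustration:

```python
import numpy as np

# Pauli matrices: spin components along x, y, z (up to a factor of 1/2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

commutator = sx @ sy - sy @ sx           # [sigma_x, sigma_y]
print(np.allclose(commutator, 2j * sz))  # True: [sx, sy] = 2i sz, not zero
```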
This is precisely what we set out to investigate. Noncommutation of charges breaks the usual derivations of the thermal state, but researchers have managed to show that with non-Abelian charges, a subtly different non-Abelian thermal state (NATS) should emerge. Nicole Yunger Halpern at the Joint Center for Quantum Information and Computer Science (QuICS) at the University of Maryland and I have collaborated with Amir Kalev from the Information Sciences Institute (ISI) at the University of Southern California, and with experimentalists from the University of Innsbruck (Florian Kranzl, Manoj Joshi, Rainer Blatt and Christian Roos), to observe thermalization in a non-Abelian system – and we’ve recently published this work in PRX Quantum.
The experimentalists used a device that can trap ions with electric fields, as well as manipulate and read out their states using lasers. Only select energy levels of these ions are used, which effectively makes them behave like electrons. The laser field can couple the ions in a way that approximates the Heisenberg Hamiltonian – an interaction that conserves the three total spin components individually. We thus construct the quantum system we want to study – multiple particles coupled with interactions that conserve noncommuting charges.
We conceptually divide the ions into a system of interest and an environment. The system of interest, which consists of two particles, is what we want to measure and compare to theoretical predictions. Meanwhile, the other ions act as the effective environment for our pair of ions – the environment ions interact with the pair in a way that simulates a large bath exchanging heat and spin.
If we start this total system in some initial state, and let it evolve under our engineered interaction for a long enough time, we can then measure the final state of the system of interest. To make the NATS distinguishable from the usual thermal state, I designed an initial state that is easy to prepare, and has the ions pointing in directions that result in high charge averages and relatively low temperature. High charge averages make the noncom muting nature of the charges more pronounced, and low temperature makes the state easy to distinguish from the thermal background. However, we also show that our experiment works for a variety of more-arbitrary states.
We let the system evolve from this initial state for as long as possible given experimental limitations, which was 15 ms. The experimentalists then used quantum state tomography to reconstruct the state of the system of interest. Quantum state tomography makes multiple measurements over many experimental runs to approximate the average quantum state of the system measured. We then check how close the measured state is to the NATS. We have found that it’s about as close as one can expect in this experiment!
And we know this because we have also implemented a different coupling scheme, one that doesn’t have non-Abelian charges. The expected thermal state in the latter case was reached within a distance that’s a little smaller than our non-Abelian case. This tells us that the NATS is almost reached in our experiment, and so it is a good, and the best known, thermal state for the non-Abelian system – we have compared it to competitor thermal states.
Working with the experimentalists directly has been a new experience for me. While I was focused on the theory and analyzing the tomography results they obtained, they needed to figure out practical ways to realize what we asked of them. I feel like each group has learned a lot about the tasks of the other. I have become well acquainted with the trapped ion experiment and its capabilities and limitations. Overall, it has been great collaborating with the Austrian group.
Our result is exciting, as it’s the first experimental observation within the field of non-Abelian thermodynamics! This result was observed in a realistic, non-fine-tuned system that experiences non-negligible errors due to noise. So the system does thermalize after all. We have also demonstrated that the trapped ion experiment of our Austrian friends can be used to simulate interesting many-body quantum systems. With different settings and programming, other types of couplings can be simulated in different types of experiments.
The experiment also opened avenues for future work. The distance to the NATS was greater than the analogous distance to the Abelian system. This suggests that thermalization is inhibited by the noncommutation of charges, but more evidence is needed to justify this claim. In fact, our other recent paper in Physical Review B suggests the opposite!
As noncommutation is one of the core features that distinguishes classical and quantum physics, it is of great interest to unravel the fine differences non-Abelian charges can cause. But we also hope that this research can have practical uses. If thermalization is disrupted by noncommutation of charges, engineered systems featuring them could possibly be used to build quantum memory that is more robust, or maybe even reduce noise in quantum computers. We continue to explore noncommutation, looking for interesting effects that we can pin on it. I am currently working on verifying a hypothesis that explains when and why quantum systems thermalize internally.
To give you a touchstone, let me present a simple example of a system with noncommuting charges. Imagine a chain of qubits, where each qubit interacts with its nearest and next-nearest neighbours, such as in the image below.
In this interaction, the qubits exchange quanta of spin angular momentum, forming what is known as a Heisenberg spin chain. This chain is characterized by three charges, the total spin components in the x, y, and z directions, which I’ll refer to as Qx, Qy, and Qz, respectively. The Hamiltonian H conserves these charges, satisfying [H, Qa] = 0 for each a, and these three charges are non-commuting, [Qa, Qb] ≠ 0, for any pair a, b ∈ {x, y, z} where a ≠ b. It’s noteworthy that Hamiltonians can be constructed to transport various other kinds of noncommuting charges. I have discussed the procedure to do so in more detail here (to summarize that post: it essentially involves constructing a Koi pond).
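These commutation properties can be verified numerically on a small chain. A sketch of my own (restricted to nearest-neighbour couplings for brevity, though the chain described above also has next-nearest-neighbour ones):

```python
import numpy as np
from functools import reduce

# Pauli matrices for the three spin directions
paulis = {
    "x": np.array([[0, 1], [1, 0]], dtype=complex),
    "y": np.array([[0, -1j], [1j, 0]]),
    "z": np.array([[1, 0], [0, -1]], dtype=complex),
}
I2 = np.eye(2, dtype=complex)
n = 3  # number of qubits

def embed(op, site):
    # Place a single-qubit operator at `site` in the n-qubit Hilbert space
    ops = [I2] * n
    ops[site] = op
    return reduce(np.kron, ops)

# Heisenberg Hamiltonian: H = sum_j sum_a sigma_a^(j) sigma_a^(j+1)
H = sum(embed(p, j) @ embed(p, j + 1)
        for j in range(n - 1) for p in paulis.values())

# Total-spin charges: Q_a = sum_j sigma_a^(j)
Q = {a: sum(embed(p, j) for j in range(n)) for a, p in paulis.items()}

conserved = all(np.allclose(H @ Q[a] - Q[a] @ H, 0) for a in "xyz")
noncommuting = not np.allclose(Q["x"] @ Q["y"] - Q["y"] @ Q["x"], 0)
print(conserved, noncommuting)  # True True
```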
This is the first in a series of blog posts where I will highlight key elements discussed in the perspective article. Motivated by requests from peers for a streamlined introduction to the subject, I’ve designed this series specifically for a target audience: graduate students in physics. Additionally, I’m gearing up to defend my PhD thesis on noncommuting-charge physics next semester, and these blog posts will double as a fun way to prepare for that.
The First Attempt
In the following section, I refer to a ‘game state’ as any unique arrangement of X’s and O’s on a board. The ‘empty game state’ simply means an empty board. ‘Traversing’ through a certain game state means that, at some point in the game, that game state occurs. So, for example, every game traverses through the empty game state, since every game starts with an empty board.
In order to solve the unsolved, one must first solve the solved. As such, my first attempt was to create an algorithm that would figure out the best move to play in regular tic-tac-toe. This first attempt was rather straightforward, and I will explain it here:
Essentially, I developed a model using what is known as “reinforcement learning” to determine the best next move given a certain game state. Here is how it works: To track which sets of moves are best for player X and player O, respectively, every game state is assigned a value, initially 0. When a game ends, these values are updated to reflect who won. The more games are played, the better these values reflect the sequence of moves that X and O must make to win or tie. To train this model (machine learning parlance for the algorithm that updates the values/parameters mentioned above), I programmed the computer to play randomly chosen moves for X and O, until the game ended. If, say, player X won, then the value of every game state traversed was increased by 1 to indicate that X was favored. On the other hand, if player O won, then the value of every game state traversed was decreased by 1 to indicate that O was favored. Here is an example:
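The training loop described above can be sketched in a few lines. This is my own minimal version – the board representation and helper functions are inventions for illustration, not the code from the project:

```python
import random
from collections import defaultdict

random.seed(0)
values = defaultdict(int)  # game state (board as a tuple) -> value

def winner(board):
    # Return "X" or "O" if a player has three in a row, else None
    lines = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
    for a, b, c in lines:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def play_random_game():
    # Play random moves until the game ends; return the winner
    # and every game state traversed along the way.
    board = [" "] * 9
    traversed = [tuple(board)]
    player = "X"
    while winner(board) is None and " " in board:
        move = random.choice([i for i, s in enumerate(board) if s == " "])
        board[move] = player
        traversed.append(tuple(board))
        player = "O" if player == "X" else "X"
    return winner(board), traversed

for _ in range(5000):
    win, traversed = play_random_game()
    delta = {"X": 1, "O": -1, None: 0}[win]
    for state in traversed:        # +1 per state if X won, -1 if O won
        values[state] += delta

# Under random play X (who moves first) wins more often,
# so the empty game state accumulates a positive value.
print(values[tuple([" "] * 9)] > 0)  # True
```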