Abstracts

Talks will take 40 min, plus 5 min for questions.

Robert Alicki  -  Two types of information and thermodynamics

Fundamental differences between information encoded in degrees of freedom that are unstable with respect to thermal noise and information encoded in stable ones are discussed within a new thermodynamical framework. It is argued that the thermodynamical cost of information processing depends on the stability of the encoding. In particular, this sheds new light on Landauer's principle and on the arguments involving Szilard's heat engine. Another consequence is the hypothesis that quantum information cannot be stabilized efficiently. The argument is supported by a Gedankenexperiment involving a Szilard engine based on a quantum memory.
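
For orientation (an editorial gloss, not part of the abstract), the two standard benchmarks the argument plays off against each other are

    W_{\mathrm{erasure}} \;\ge\; k_B T \ln 2 \qquad \text{(Landauer bound per erased bit)},
    W_{\mathrm{Szilard}} \;=\; k_B T \ln 2 \qquad \text{(work extracted per Szilard engine cycle)},

so that the work the engine extracts is exactly repaid when the demon's one-bit memory is erased; the talk's point is that this bookkeeping changes once the stability of the memory is taken into account.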

Janet Anders  -  Landauer’s erasure principle in the strongly coupled quantum regime

Several publications in recent years have discussed thermodynamic processes in strongly coupled quantum systems and claimed violations of both Landauer's principle and the second law of thermodynamics. If true, this would have powerful consequences: perpetual motion machines could be built as long as the operating temperature is brought close to zero. It would also have serious consequences for thermodynamic derivations of information-theoretic results, such as the Holevo bound. I will review the original discussion of the model of a quantum Brownian oscillator and argue why previous treatments are erroneous. It turns out that the correlations established in quantum systems at low temperatures require a rethink of how entropy, heat and work are calculated. I will show that a consistent treatment resolves the paradoxical situation.
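
For reference, the quantum Brownian oscillator mentioned above is conventionally modelled by a Caldeira-Leggett-type Hamiltonian (a standard form, quoted here for orientation rather than taken from the talk):

    H \;=\; \frac{p^2}{2m} + \frac{1}{2} m \omega^2 x^2
        \;+\; \sum_k \left[ \frac{p_k^2}{2 m_k}
        + \frac{1}{2} m_k \omega_k^2 \Big( x_k - \frac{c_k}{m_k \omega_k^2}\, x \Big)^{\!2} \right],

in which the system oscillator couples bilinearly to a bath of oscillators. The apparent paradoxes arise when the system-bath coupling terms are no longer negligible at low temperatures.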

References
[1] S. Hilt, J. Anders, E. Lutz, S. Shabbir, PRE (R) 83:030102 (2011);
[2] J. Anders, S. Shabbir, S. Hilt, E. Lutz, Elect. Proc. Comp. Sci. 26:13 (2010) (arXiv:1006.1420v1).

Charles Bennett  -  Forgetting and Erasing

Entanglement provides a coherent view of the physical origin of randomness and the growth and decay of correlations, even in macroscopic systems exhibiting few traditional quantum hallmarks. The most private information, exemplified by a quantum eraser experiment, exists only transiently: after the experiment is over, no record remains anywhere in the universe of what "happened". At the other extreme is information that has been so widely replicated as to be infeasible to conceal and unlikely to be forgotten. But such durable information is exceptional: most macroscopic classical information---for example the pattern of drops in last week's rainfall---is impermanent, eventually becoming nearly as ambiguous, from a terrestrial perspective, as the transient result of a quantum eraser experiment. We then discuss prerequisites for a system to accumulate and maintain in its present state, as our world does, a complex and redundant record of at least some features of its past. Not all dynamics and initial conditions lead to this behavior, and in those that do, the behavior itself tends to be temporary, with the system losing its memory, and even its classical character, as it relaxes to thermal equilibrium. Finally we discuss the thermodynamics of computation, and the role of Landauer's principle as an avatar of the second law of thermodynamics.

Jens Eisert  -  A quantum information view on equilibration and the emergence of statistical ensembles

(overview talk)
This talk will be concerned with recent progress on understanding how quantum many-body systems out of equilibrium eventually come to rest. The first part of the talk will highlight theoretical progress on this question, employing ideas of Lieb-Robinson bounds, quantum central limit theorems and concentration of measure [1-4]. These findings will be complemented by experimental work with ultra-cold atoms in optical lattices, constituting a dynamical "quantum simulator" that allows one to probe physical questions presently out of reach even for state-of-the-art numerical techniques based on matrix-product states [5]. The last part of the talk will sketch how, based on the above ideas, a fully certifiable quantum algorithm for preparing Gibbs states can be constructed, complementing quantum Metropolis algorithms [6].
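
As a toy illustration of such local relaxation (an editorial sketch, not material from the talk; the model, parameters and initial state are arbitrary choices), one can exactly diagonalize a small non-integrable Ising chain and watch a single-site observable settle to a stationary value despite the globally unitary dynamics:

    import numpy as np

    # Pauli matrices and a helper to embed a single-site operator into the chain
    I2 = np.eye(2)
    X = np.array([[0., 1.], [1., 0.]])
    Z = np.array([[1., 0.], [0., -1.]])

    def op(site, o, n):
        m = np.eye(1)
        for k in range(n):
            m = np.kron(m, o if k == site else I2)
        return m

    n = 8  # 2^8 = 256 dimensional: exact diagonalization is cheap
    # Ising chain with transverse and longitudinal fields (non-integrable)
    H = sum(op(k, Z, n) @ op(k + 1, Z, n) for k in range(n - 1))
    H = H + sum(1.05 * op(k, X, n) + 0.5 * op(k, Z, n) for k in range(n))
    evals, evecs = np.linalg.eigh(H)

    psi0 = np.zeros(2 ** n)
    psi0[0b01010101] = 1.0            # Neel-like product initial state
    c = evecs.T @ psi0                # coefficients in the energy eigenbasis

    for t in [0.0, 0.5, 1.0, 2.0, 5.0, 20.0]:
        psi_t = evecs @ (np.exp(-1j * evals * t) * c)
        m = psi_t.reshape(2, -1)      # split off qubit 0
        rho = m @ m.conj().T          # reduced density matrix of qubit 0
        print(f"t = {t:5.1f}   <Z_0> = {np.trace(rho @ Z).real:+.4f}")

Exact diagonalization of this kind is limited to a handful of sites; the regime beyond it is exactly what the quantum-simulator experiments mentioned above address.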

References
[1] "Absence of thermalization in non-integrable systems", Phys. Rev. Lett. 106, 040401 (2011).
[2] "Concentration of measure for quantum states with a fixed expectation value", Commun. Math. Phys. 303, 785 (2011).
[3] "A quantum central limit theorem for non-equilibrium systems: Exact local relaxation of correlated states", New J. Phys. 12, 055020 (2010).
[4] "Exact relaxation in a class of non-equilibrium quantum lattice systems", Phys. Rev. Lett. 100, 030602 (2008).
[5] "Probing the relaxation of a strongly correlated 1D Bose gas towards equilibrium", submitted to Nature Physics (2011).
[6] "Gibbs states, exact thermalization of quantum systems and a certifiable algorithm for preparing thermal states", arXiv:1102.2389, submitted to Phys. Rev. Lett. (2011).

Jochen Gemmer  -  A transient fluctuation theorem in closed quantum systems

Our point of departure is the unitary dynamics of closed quantum systems as generated by the Schrödinger equation. We focus on a class of quantum models that typically exhibit roughly exponential relaxation of some observable within this framework. Furthermore, we focus on pure-state evolutions. An entropy in accord with Jaynes' principle is defined on the basis of the quantum expectation value of the above observable. It is demonstrated that the resulting deterministic entropy dynamics are, in a sense, in accord with a transient fluctuation theorem. Moreover, we demonstrate that the dynamics of the expectation value may be described in terms of an Ornstein-Uhlenbeck process. These findings are demonstrated numerically and supported by analytical considerations based on quantum typicality.
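
To make the last statement concrete, here is a minimal sketch of an Ornstein-Uhlenbeck process (illustrative parameters only; not the models analyzed in the talk), whose ensemble mean relaxes exponentially while the fluctuations settle at the stationary variance:

    import numpy as np

    rng = np.random.default_rng(0)
    gamma, sigma = 1.0, 0.3          # relaxation rate and noise strength
    dt, steps, ntraj = 0.01, 2000, 5000

    x = np.ones(ntraj)               # all trajectories start displaced from 0
    for _ in range(steps):
        # Euler-Maruyama step for  dx = -gamma * x dt + sigma dW
        x += -gamma * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(ntraj)

    # The ensemble mean decays like exp(-gamma t); the fluctuations approach
    # the stationary OU variance sigma^2 / (2 gamma).
    print("mean     :", x.mean(), "(~0 up to sampling error)")
    print("variance :", x.var(), "vs sigma^2/(2 gamma) =", sigma ** 2 / (2 * gamma))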

Christian Gogolin  -  Thermalization in nature and on a quantum computer   

Using the assumption that thermodynamic systems evolve towards Gibbs states, i.e. states with a well-defined temperature, statistical mechanics and thermodynamics have been amazingly successful in explaining a wide range of physical phenomena. In stark contrast to this strong empirical corroboration, the question of whether and how the methods of statistical mechanics and thermodynamics can be justified microscopically remained wide open until recently. With new mathematical tools from quantum information theory becoming available, there has been a renewed effort to settle this old question. I will present and discuss a necessary and a sufficient condition for the emergence of Gibbs states from the unitary dynamics of quantum mechanics, and show how these new insights into the process of equilibration and thermalization can be used to design a quantum algorithm that prepares thermal states on a quantum computer/simulator.
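
For concreteness (an editorial gloss in the standard notation of this literature): the Gibbs state at inverse temperature β and the corresponding notion of local equilibration read

    \rho_\beta \;=\; \frac{e^{-\beta H}}{\operatorname{tr} e^{-\beta H}},
    \qquad
    \big\| \rho_S(t) - \omega_S \big\|_1 \approx 0 \quad \text{for most times } t,

where ρ_S(t) is the reduced state of the subsystem S and ω_S is the reduction of the time-averaged state; thermalization additionally demands that ω_S be close to the reduction of ρ_β.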

References
[1] A. Riera, C. Gogolin, J. Eisert, http://arxiv.org/abs/1102.2389
[2] C. Gogolin, M. P. Müller, and J. Eisert, Phys. Rev. Lett. 106, 040401 (2011) http://arxiv.org/abs/1009.2493
[3] C. Gogolin, http://arxiv.org/abs/1003.5058
[4] C. Gogolin, Phys. Rev. E 81, 051127 (2010), http://arxiv.org/abs/0908.2921

Michał Horodecki  -  TBA

I will give a brief review of the literature on TBA and present a couple of new results.

Dominik Janzing  -  Three steps towards an entirely quantum description of thermodynamic machines

In the first step, I consider heat engines and refrigerators where the hot and cold reservoirs are quantum, but the energy sink or source is a classical system implementing a non-energy conserving unitary process [1].  These simple toy models teach thermodynamics from a modern perspective and nicely show the complexity of optimal thermodynamic ``machines''.

In the second step the energy sink/source is also quantum, and the classical controller is only allowed to implement transformations that commute with the joint Hamiltonian of reservoirs and energy sink/source. This way, we can classify quantum states with respect to their ability to energetically drive the preparation of others [2]. Within such a model, perfect cooling or bit erasure requires infinitely many thermal resources unless the cold reservoir is already at temperature zero.
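
In symbols (an editorial gloss): the controller in this second step may implement only unitaries U satisfying

    [\,U, H\,] \;=\; 0,

i.e. strict conservation of the joint energy of reservoirs and energy sink/source. This is what makes the quantum states themselves, rather than an external work source, the thermodynamic resource.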

Third, I describe a model where the controller, too, is quantum and the world is just a Hamiltonian system or cellular automaton. I introduce the concept of a ``physically universal'' Hamiltonian or CA (as opposed to a computationally universal one), which has the property that any desired control operation on a finite region can be implemented by appropriately initializing its complement. The existence of such a physically universal Hamiltonian or CA (which I pose as an open question) would show that the boundary between a system and its controller can be consistently shifted. I argue that discussing fundamental thermodynamic constraints requires models of this kind. This part of my talk will mainly contain questions, but I have tried to phrase them as well-defined mathematical problems.

References
[1] Janzing: On the computational power of molecular heat engines, J. Stat. Phys. (2005)
[2] Janzing, Wocjan, Zeier, Geiss, Beth: Thermodynamic cost of reliability and low temperatures: Tightening Landauer's principle and the Second Law, Int. J. Theor. Phys. (2000)
[3] Janzing: Is there a physically universal cellular automaton or Hamiltonian? arXiv:1009.1720v1  (2010)

Sania Jevtic  -  Thermodynamics in the presence of correlations

We investigate how the local thermodynamical properties of quantum systems change if the systems are correlated. By looking at some simple thermodynamic scenarios, including a model for heat flow between two systems and the use of a Szilard engine to extract work from a bipartite state, we show that correlations can make heat flow from cold to hot and produce more work, both at constant average energy. To put this into perspective, we introduce a situation in which a global Maxwell demon can confuse a local observer when the system they are measuring possesses correlations: the local observer sees a reversal of their local arrow of time and an apparent violation of the second law.
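
One standard way to make this quantitative (an editorial sketch along the lines of this literature, not necessarily the talk's notation): for an energy-conserving unitary acting on two locally thermal systems A (cold, inverse temperature β_c) and B (hot, β_h), invariance of the global entropy together with positivity of relative entropy gives

    Q\,(\beta_c - \beta_h) \;\ge\; \Delta I(A\!:\!B),

where Q is the heat entering the cold system and I(A:B) is the mutual information. Initially uncorrelated states force ΔI ≥ 0 and hence Q ≥ 0, the usual direction of heat flow; pre-existing correlations allow ΔI < 0, so heat can flow from cold to hot, paid for by consuming correlations.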

Markus Mueller  -  Typical entanglement, coin tossing, and a general-probabilistic decoupling theorem

One of the main paradigms of statistical physics is the idea that small systems (S) tend to "randomize" if they are coupled to a large environment (E). An instance of this phenomenon is the fact that bipartite quantum states on SE are typically almost maximally entangled, which has recently been discussed as a possible justification of the postulate of equal a priori probabilities [1].
An apparently very different, but familiar, phenomenon is coin tossing: if a classical coin (S) is tossed in an unknown environment (E), the coin's state itself usually becomes unknown. In the talk, I report on joint work with Oscar Dahlsten and Vlatko Vedral, where we show that both examples can be viewed as instances of a more general phenomenon. We give a unifying formula, valid for a large class of probabilistic theories, which quantifies the expected amount of subsystem randomization in terms of two simple properties of S and E (their capacities and dimensions).

To this end, we generalize the notion of "purity" and the decoupling theorem to probabilistic theories. In joint work with Jonathan Oppenheim, we apply this to the black hole information paradox, generalizing the analysis in [2] to the case of conceivable theories beyond quantum mechanics. We also get some new results on typical quantum entanglement as a by-product.
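
For the quantum case, the typicality statement alluded to in the first paragraph is Page's classic estimate (quoted for orientation; it is not a result of this work): for a Haar-random pure state on S ⊗ E with dimensions d_S ≤ d_E, the expected entanglement entropy of S is

    \langle S(\rho_S) \rangle \;\approx\; \ln d_S - \frac{d_S}{2\, d_E},

i.e. almost maximal whenever the environment is much larger than the system. The decoupling formula described above quantifies the analogous randomization in general probabilistic theories.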

References
[1]  S. Popescu, A. Short, A. Winter, Entanglement and the foundations of statistical mechanics, Nature Physics 2, 754-758 (2006).
[2] P. Hayden and J. Preskill, Black holes as mirrors: quantum information in random subsystems, Journal of High Energy Physics 09, 120 (2007).
[3] M. Müller, O. Dahlsten, V. Vedral, arXiv:1107.???

Jonathan Oppenheim  -  Thermodynamics as a resource theory (cont.)

Some preliminary results on this idea (continuation of Rob's talk).

Jochen Rau  -  Inferring constants of the motion of a small quantum system

In addition to the tomography of quantum states and processes there is a further tomography task that is not usually considered: Whenever the precise dynamics of a many-body quantum system is not known, its constants of the motion must be inferred from (possibly imperfect) measurement data. Operationally, this can be done by preparing multiple samples of the system in randomly varying initial states, and letting these samples evolve independently to their respective equilibrium states. Subsequent state tomography on all samples is then expected to reveal that, modulo random fluctuations, the final equilibrium states are distributed on some low-dimensional submanifold of state space. This submanifold is composed of Gibbs states determined by the constants of the motion.

Whenever the number of different samples or their sizes are small, however, the reconstruction of the Gibbs manifold (and hence of the constants of the motion) is a nontrivial inference task. In purely statistical terms, this is a situation where data in some high-dimensional space (the tomographic images in state space) are presumed to be explained by a small number of latent variables (expectation values of the constants of the motion), effectively reducing the dimensionality of the data. In such a generic setting, the task is to infer the optimal dimension and orientation of the lower-dimensional latent space. Problems of this type can be tackled with a variety of statistical techniques such as factor analysis or principal component analysis. In my talk, I present a statistical framework which draws inspiration from these generic techniques, yet is tailored to the specific task of inferring the constants of the motion from imperfect measurement data.
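
As a toy version of the generic dimension-inference step (an editorial sketch on synthetic data; the plain PCA used here merely stands in for the tailored framework of the talk):

    import numpy as np

    rng = np.random.default_rng(1)
    n_samples, ambient_dim, latent_dim, noise = 200, 15, 3, 0.05

    # Synthetic stand-in for tomographic data: points on a latent_dim-dimensional
    # affine subspace (the "Gibbs manifold"), blurred by measurement noise.
    basis = np.linalg.qr(rng.standard_normal((ambient_dim, latent_dim)))[0]
    latent = rng.standard_normal((n_samples, latent_dim))
    data = latent @ basis.T + noise * rng.standard_normal((n_samples, ambient_dim))

    # PCA: the singular-value spectrum of the centred data matrix
    centred = data - data.mean(axis=0)
    svals = np.linalg.svd(centred, compute_uv=False)
    print("leading singular values:", np.round(svals[:6], 2))
    # A large gap after the third value signals three effective constants
    # of the motion (linear latent dimensions) in this synthetic example.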

References
[1] J. Rau, Phys. Rev. A 84, 012101 (2011)
[2] J. Rau, arXiv:1103.2803

Peter Reimann  -  Equilibration and thermalization under realistic preparation and measurement conditions   

We demonstrate equilibration of isolated macroscopic quantum systems, prepared in non-equilibrium mixed states with significant population of many energy levels, and observed by instruments with a reasonably bounded working range compared to their resolution limit. Both properties are fulfilled under many, if not all, experimentally realistic conditions. A generic modelling of preparation and measurement implies thermalization in accordance with a conjecture by Asher Peres, which is related to, but weaker than, the so-called eigenstate thermalization hypothesis (ETH).
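
For reference (an editorial gloss), the ETH is usually stated via Srednicki's ansatz for the matrix elements of an observable A in the energy eigenbasis,

    A_{mn} \;=\; \mathcal{A}(\bar{E})\,\delta_{mn}
        \;+\; e^{-S(\bar{E})/2}\, f_A(\bar{E},\omega)\, R_{mn},
    \qquad \bar{E} = \tfrac{E_m + E_n}{2}, \quad \omega = E_m - E_n,

with smooth functions 𝒜 and f_A, thermodynamic entropy S, and R_mn of order unity; the Peres-type conjecture invoked in the abstract is related to, but weaker than, this ansatz.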

Augusto Roncaglia  -  A system equilibrates when diagonalizing its Hamiltonian is difficult

In classical mechanics there is a relation between integrability and equilibration, but this relation is not well understood in the quantum case. Closed quantum systems never globally equilibrate to a stationary state. However, in some circumstances the system equilibrates locally: the reduced density matrix of a subsystem evolves to a stationary state. All finite-dimensional quantum systems are integrable in the sense that solving their dynamics reduces to diagonalizing their Hamiltonian. This procedure may require a large computational effort, which differs from system to system. In this sense, the lack of integrability of a quantum system can be quantified by the computational complexity of diagonalizing its Hamiltonian. We show that if this complexity is at least quadratic in the size of the system, then local equilibration holds (for almost all Hamiltonians); and if this complexity is sub-linear, then local equilibration does not hold.

Benjamin Schumacher  -  Landauer's principle, fluctuations and the Second Law

As Bennett showed, Maxwell's demon cannot achieve a violation of the Second Law of thermodynamics because of the thermodynamic cost of information erasure (known as Landauer's principle).  This suggests a new statement of the Second Law, one that is provably equivalent to more familiar versions:  No process can have as its sole result the erasure of information.  Recent work by Sagawa and Ueda on generalized Jarzynski equalities further clarifies the statistical behavior of information engines [1].
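
The Sagawa-Ueda equality referred to in [1] reads, in its standard form,

    \big\langle e^{-\beta (W - \Delta F) - I} \big\rangle \;=\; 1,

where I is the mutual information acquired by the feedback measurement. The ordinary Jarzynski equality is recovered for I = 0, and Jensen's inequality yields the generalized second law ⟨W⟩ ≥ ΔF − k_B T ⟨I⟩ for feedback-controlled processes.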

References
[1] T. Sagawa and M. Ueda, Generalized Jarzynski Equality under Nonequilibrium Feedback Control, arXiv:0907.4914 (2009).

Paul Skrzypczyk  -  Small thermal machines: virtual qubits and virtual temperatures

I will explain how the perspective of 'virtual qubits' at 'virtual temperatures' can be used to understand how thermal machines function, focusing primarily on the simplest situation, that of the smallest thermal machines. The two baths necessary to create a thermal machine, if viewed as a single composite system, contain two-level subsystems (virtual qubits) whose temperatures can take on any value, positive or negative. Thermal machines place external systems in thermal contact with a filtered range of virtual qubits, with the filtered (virtual) temperatures determining the thermodynamic behaviour. I will discuss what we can say from this perspective about the Carnot efficiency, entropy production and work.
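
As a concrete instance (a standard construction, sketched here for orientation): take one qubit with gap E_c thermalized by the cold bath at β_c and one with gap E_h thermalized by the hot bath at β_h. The pair of levels |1_c 0_h⟩ and |0_c 1_h⟩ forms a virtual qubit of gap E_h − E_c whose population ratio defines the virtual inverse temperature

    \beta_v \;=\; \frac{\beta_h E_h - \beta_c E_c}{E_h - E_c},

which can be tuned to any value, positive or negative, by varying the gaps E_c and E_h.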

Rob Spekkens  -  Thermodynamics as a resource theory

Introduction to this idea (leading up to Jonathan's talk).

Masahito Ueda  -  Information Thermodynamics: Maxwell’s Demon and Quantum Szilard Engine   

The second law of thermodynamics presupposes a clear-cut distinction between the controllable and uncontrollable degrees of freedom by means of macroscopic operations. The state-of-the-art technologies in quantum information and nano-science seem to force us to abandon such a notion in favor of a distinction between the accessible and inaccessible degrees of freedom. In this talk, I will discuss the implications of this paradigm shift by focusing on how the second law of thermodynamics can be generalized in the presence of feedback control [1]. I will also discuss the minimum work required for the measurement and erasure of information [2]. The Jarzynski equality likewise has to be generalized in the presence of feedback control [3], as confirmed experimentally using polystyrene beads [4]. A quantum generalization of the Szilard engine will also be discussed [5].
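
In formulas (standard results of [1,2], quoted for orientation): feedback based on a measurement yielding mutual information I modifies the second law to

    \langle W_{\mathrm{ext}} \rangle \;\le\; -\Delta F + k_B T\, I,
    \qquad
    W_{\mathrm{meas}} + W_{\mathrm{erase}} \;\ge\; k_B T\, I,

so that up to k_B T I of extra work can be extracted using the acquired information, while measurement and erasure together cost at least as much, keeping the overall cycle consistent with the conventional second law.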

References
[1] T. Sagawa and M. Ueda, Phys. Rev. Lett. 100, 080403 (2008).
[2] T. Sagawa and M. Ueda, Phys. Rev. Lett. 102, 250602 (2009); 106, 189901 (2011).
[3] T. Sagawa and M. Ueda, Phys. Rev. Lett. 104, 090602 (2010).
[4] S. Toyabe, T. Sagawa, M. Ueda, E. Muneyuki, and M. Sano, Nature Physics, 6, 988 (2010).
[5] S. W. Kim, T. Sagawa, S. De Liberato, and M. Ueda, Phys. Rev. Lett. 106, 070401 (2011).

Joan Vaccaro  -  Erasure of information under conservation laws   

Heat engines and the erasure of information derive from the same fundamental principle: maximisation of entropy subject to conservation of energy. Landauer argued that the process of erasing the information stored in a memory device incurs an energy cost in the form of a minimum amount of mechanical work. If other laws constrain the erasure process, there may be costs in addition to energy. Here we derive the cost of erasing information when an additional conservation law, such as conservation of angular momentum, must be satisfied. We find that Landauer's minimum work must be supplemented by a cost in terms of angular momentum. We also find that the energy cost can be reduced to zero, leaving only the angular momentum cost. Using this to erase the memory of Maxwell's demon implies that work can be extracted from a single thermal reservoir at a cost of angular momentum and an increase in total entropy.
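
A minimal sketch of where such a bound comes from (an editorial gloss assuming a reservoir in a generalized Gibbs state, not the detailed model of [1]): if the reservoir state is ρ ∝ e^{−λ J_z}, positivity of relative entropy limits its entropy increase to ΔS ≤ λ Δ⟨J_z⟩, so dumping the ln 2 of entropy from erasing one bit costs

    \Delta \langle J_z \rangle \;\ge\; \frac{\ln 2}{\lambda},

an angular-momentum cost that plays the role of k_B T ln 2 when the Lagrange multiplier λ, rather than an inverse temperature, characterizes the reservoir (entropy in nats).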

References
[1] J.A. Vaccaro and S.M. Barnett, "Information erasure without an energy cost", Proc. R. Soc. Lond. A 467, 1770-1778 (2011) [arXiv:1004.5330].

Stephanie Wehner  -  Understanding thermalization from decoupling

First, we show how major results on the foundations of statistical physics can not only be reproduced, but also generalized and extended, by use of the decoupling theorem. As an example, we extend recent results that justify the postulate of equal a priori probability. In particular, our results allow us to replace the choice of Haar-random states of the system and the environment by a statement about all states after a sufficient number of random two-qubit interactions.

Second, we consider what happens when we bring a particular system into contact with an environment. How long does it take until the system has forgotten its initial state? Our results make statements about the actual state of the system at any given time, instead of statements about temporal averages. In particular, we show that for almost all states of the system, the time it takes for them to become independent of their initial state can be understood by analyzing an entropic inequality involving just a single state at time t. Finally, we show that for almost all states of the environment, a quantum mechanical system stays close to its initial state for all times if the relevant eigenstates of the system-environment Hamiltonian are sufficiently weakly entangled.
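
A small numerical illustration of the first part (an editorial sketch; the qubit count and gate count are arbitrary choices): under random two-qubit interactions, a designated qubit rapidly decouples from its initial state, approaching the maximally mixed state:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 6                                    # total number of qubits

    def haar_unitary(d):
        # Haar-random d x d unitary via QR of a complex Ginibre matrix
        g = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
        q, r = np.linalg.qr(g)
        return q * (np.diag(r) / np.abs(np.diag(r)))

    def apply_two_qubit(psi, u, i, j):
        # apply a 4x4 unitary u to qubits i < j of an n-qubit state vector
        t = np.moveaxis(psi.reshape([2] * n), (i, j), (0, 1)).reshape(4, -1)
        t = (u @ t).reshape([2, 2] + [2] * (n - 2))
        return np.moveaxis(t, (0, 1), (i, j)).reshape(-1)

    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0                             # start in |00...0>
    for step in range(1, 31):
        i, j = sorted(rng.choice(n, size=2, replace=False))
        psi = apply_two_qubit(psi, haar_unitary(4), i, j)
        if step % 5 == 0:
            m = psi.reshape(2, -1)           # split off qubit 0
            rho = m @ m.conj().T             # its reduced density matrix
            dist = 0.5 * np.abs(np.linalg.eigvalsh(rho - np.eye(2) / 2)).sum()
            print(f"after {step:2d} gates: trace distance from I/2 = {dist:.3f}")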

References
[1] F. Dupuis, M. Berta, J. Wullschleger, R. Renner, The Decoupling Theorem, arXiv:1012.6044v1 (2010).
[2] S. Popescu, A. Short, A. Winter, Entanglement and the foundations of statistical mechanics, Nature Physics 2, 754-758 (2006).
[3] A.W. Harrow, R.A. Low, Random Quantum Circuits are Approximate 2-designs, Communications in Mathematical Physics 291, 257-302 (2009).
[4] N. Linden, S. Popescu, A. Short, A. Winter, Quantum mechanical evolution towards thermal equilibrium, Phys. Rev. E 79, 061103 (2009).
[5] C. Gogolin, M. P. Müller, J. Eisert, Absence of Thermalization in Nonintegrable Systems, Phys. Rev. Lett. 106, 040401 (2011).
[6] Adrian Hutter, Stephanie Wehner et al., several papers to appear on the arXiv soon.