## Turing machine and Landauer limit

04/06/2012

My inactivity period was due to a lack of real news around the world. But I was not inactive at all. My friend Alfonso Farina presented me with another question that has occupied my mind for the last few weeks: what is the energy cost of computation? The first name that comes to mind in such a case is Rolf Landauer, who in 1961 wrote a fundamental paper on this question. The main conclusion drawn by Landauer was that each operation on a bit carries an entropy cost of $K\ln 2$, $K$ being the Boltzmann constant. This means that, if you are operating at a temperature $T$, there will be a heat emission of $KT\ln 2$, and this is the Landauer limit. This idea stems from the fact that information is not some abstract entity living in the hyperuranion: to stay in the real world it needs a physical support. And wherever there is a physical support, thermodynamics and its second principle are at work. Otherwise, we could use information to evade the second principle and build our preferred perpetual motion machine. As Charles Bennett proved, Maxwell's demon cannot work due to the Landauer limit (for a review see here).
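To get a feeling for the numbers, here is a minimal sketch (my own, purely for illustration) that evaluates $KT\ln 2$ at room temperature:

```python
import math

# Boltzmann constant (CODATA value) and an assumed room temperature
K = 1.380649e-23  # J/K
T = 300.0         # K

# Landauer limit: minimum heat dissipated per bit operation
E_bit = K * T * math.log(2)
print(f"Landauer limit at {T} K: {E_bit:.3e} J per bit")  # ~2.9e-21 J
```

A few zeptojoules per bit: this is why the limit went unnoticed by practical electronics for so long.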

Recently, a group of researchers was able to show, with a clever experimental setup, that Landauer’s principle is indeed true (see here). This makes it mandatory to show theoretically that Landauer’s principle is indeed a theorem and not just a conjecture.

To accomplish this task, we need a conceptual tool that can map computation theory to physics. This tool has existed for a long time and was devised by Alan Turing: the Turing machine. A Turing machine is a conceptual computational device, introduced to show that there exist mathematical functions whose computation cannot be completed in finite time, a question asked by Hilbert in 1928 (see here). A Turing machine can compute whatever a real machine can (this is the content of the Church-Turing thesis). There exist several different kinds of Turing machines, but all are able to perform the same computations. The main difference lies in the complexity of the computation itself rather than in its realization. This conceptual tool is now an everyday tool in computation theory, used to prove fundamental results. So, if we are able to map a Turing machine onto a physical system and determine its entropy, we can promote Landauer’s principle from conjecture to theorem.

In my paper that appeared today on arXiv (see here) I was able to show that such a map exists. But how can we visualize it? Consider a Turing machine with two symbols and a probabilistic rule to drive it. The probabilistic rule is simply coded on another tape that can be consulted to decide the next move. This is a two-symbol probabilistic Turing machine. In physics we have such a system and it is very well known: the Ising model. As stated above, a probabilistic Turing machine can perform any computation a deterministic Turing machine can; what changes is the complexity of the computation itself (see here). Indeed, a sequence of symbols on the tape of the Turing machine is exactly a configuration of a one-dimensional Ising model. This model has no critical temperature, and any configuration is a plausible outcome of a computation of a Turing machine, or its input. What we need is a proper time evolution that settles into an equilibrium state, representing the end of the computation.
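A toy sketch may make the correspondence concrete. The code below is my own illustration, not the construction of the paper: the two tape symbols $\{0,1\}$ map to Ising spins $\{-1,+1\}$, and a second tape of random numbers plays the role of the coded probabilistic rule.

```python
import random

# Map a two-symbol tape {0,1} to a 1D Ising configuration {-1,+1} and back
def tape_to_spins(tape):
    return [2 * b - 1 for b in tape]

def spins_to_tape(spins):
    return [(s + 1) // 2 for s in spins]

# One probabilistic "move": flip the cell under the head with a probability
# read off a second tape (the coded probabilistic rule), then shift the head.
def step(tape, head, random_tape, step_index, p_flip=0.5):
    if random_tape[step_index] < p_flip:
        tape[head] ^= 1            # flip the symbol, i.e. flip the spin
    return (head + 1) % len(tape)  # move the head (here: always right)

random.seed(1)
tape = [0, 1, 1, 0, 1]
random_tape = [random.random() for _ in range(10)]
head = 0
for t in range(10):
    head = step(tape, head, random_tape, t)
print(tape, "->", tape_to_spins(tape))
```

At any instant the tape content is literally an Ising spin configuration, which is the point of the mapping.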

The time evolution of the one-dimensional Ising model was formulated by Roy Glauber in 1963. The Glauber model has a master equation that converges, as time evolves, to an equilibrium with a Boltzmann distribution. The entropy of the model at the end of its evolution is well known and, for a single particle, takes the limiting value $K\ln 2$, as it should; but this is just a lower limit. So, we can conclude that the operations of our Turing machine involve a quantity of emitted heat in agreement with Landauer’s principle, and this is now a theorem. It is interesting to note that the heat emitted at room temperature for a petabit of data is just a few millionths of a joule, a very small amount. This keeps managing information cheap and, unfortunately, cybercrime still easy to perform.
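As a sketch (my transcription of Glauber's standard single-spin flip rate, plus the petabit estimate; the rate constant `alpha` just sets the time scale):

```python
import math

K = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed room temperature, K

# Glauber's 1963 single-spin flip rate for the 1D Ising chain:
#   w_i = (alpha/2) * (1 - (gamma/2) * s_i * (s_left + s_right)),
# with gamma = tanh(2*J/(K*T)); it drives the chain to Boltzmann equilibrium.
def glauber_rate(s_left, s_i, s_right, J_over_KT, alpha=1.0):
    gamma = math.tanh(2.0 * J_over_KT)
    return 0.5 * alpha * (1.0 - 0.5 * gamma * s_i * (s_left + s_right))

# Landauer heat for erasing a petabit of data at room temperature
n_bits = 1e15
Q = n_bits * K * T * math.log(2)
print(f"Heat to erase a petabit at {T} K: {Q:.2e} J")  # a few microjoules
```

Note that at infinite temperature (`J_over_KT = 0`) every flip is equally likely, while at low temperature flips that break an aligned pair are suppressed, which is exactly the relaxation toward equilibrium described above.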

Landauer, R. (1961). Irreversibility and Heat Generation in the Computing Process. IBM Journal of Research and Development, 5(3), 183-191. DOI: 10.1147/rd.53.0183

Bennett, C. (2003). Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon. Studies In History and Philosophy of Science Part B: Studies In History and Philosophy of Modern Physics, 34(3), 501-510. DOI: 10.1016/S1355-2198(03)00039-X

Bérut, A., Arakelyan, A., Petrosyan, A., Ciliberto, S., Dillenschneider, R., & Lutz, E. (2012). Experimental verification of Landauer’s principle linking information and thermodynamics. Nature, 483(7388), 187-189. DOI: 10.1038/nature10872

Frasca, M. (2012). Probabilistic Turing Machine and Landauer Limit. arXiv: 1206.0207v1

## On the arrow of time again

10/09/2009

Lorenzo Maccone’s argument (see here) is still a hot topic. Today, a paper by David Jennings and Terry Rudolph (Imperial College, London) appeared (see here), claiming that Maccone’s argument is incorrect. Indeed, they summarize Maccone’s argument as follows:

> Any decrease in entropy of a system that is correlated with an observer entails a memory erasure of said observer

but this erasure is provided by quantum correlations. The key point is the link between quantum correlations and the local decrease of entropy as seen through classical correlations. Jennings and Rudolph interpret Maccone’s view as saying that a reduction of information at the quantum level entails a reduction of information at the classical level, and we do not observe such events. These authors exhibit counterexamples where this does not happen, arguing that Maccone’s argument does not explain but rather worsens the problem, since quantum correlations can decrease while classical ones increase.

I guess that this comment will undergo the standard Physical Review Letters procedure and that Lorenzo Maccone will produce a counterargument, thereby facing a review process. As it stands, it poses a substantial open problem for Maccone’s original proposal, but it relies in an essential way on the interpretation Jennings and Rudolph attach to it.

This being a really exciting matter, it will be very interesting to follow how events unfold.

## The question of the arrow of time

31/08/2009

A recent paper by Lorenzo Maccone in Physical Review Letters (see here) has stirred up some fuss. He tries to solve the question of the arrow of time from a quantum standpoint. Lorenzo is currently a visiting researcher at MIT and, together with Vittorio Giovannetti and Seth Lloyd, he has produced several important works in the area of quantum mechanics and its foundations. I had the luck to meet him at a conference in Gargnano on Lake Garda, together with Vittorio. So, it is no surprise to see this paper of his attempting to solve one of the outstanding problems of physics.

The question of the arrow of time is still open. One might think that Boltzmann’s H-theorem closed this question definitively, but this is false. The theorem was the starting point of a question yet to be settled. Boltzmann presented a first version of his theorem that exhibited one of the most beautiful laws in physics: the relation between entropy and probability. This proof was criticized by Loschmidt (see here), and the criticism was sound. Indeed, Boltzmann had to modify his proof by introducing the so-called Stosszahlansatz, or molecular chaos hypothesis, thereby introducing time asymmetry by hand. Of course, we know for certain that the theorem is true, and so the hypothesis of molecular chaos must be true as well. Thus, the question of the arrow of time will be solved only when we know where molecular chaos comes from. This means that we need a mechanism, a quantum one, to explain Boltzmann’s hypothesis. It is important to emphasize that, to date, no proof of the H-theorem exists that removes this assumption.

Quantum mechanics could be the answer to this situation, but only if we knew how reality forms. An important role in this direction could be played by environmental decoherence and its relation to the question of the collapse. A collapse immediately grants asymmetry in time, and here one has to cope with many-body physics involving a very large number of components. In this respect there exists a beautiful theorem by Elliott Lieb and Barry Simon, two of the most prominent living mathematical physicists, that says:

> The Thomas-Fermi model is the limit of quantum theory as the number of particles goes to infinity.

For a more precise statement you can look at Reviews of Modern Physics, page 620ff. The Thomas-Fermi model is a semi-classical model, so this fundamental theorem can simply be restated as saying that the limit of a very large number of particles in quantum mechanics is the classical world. In some sense, there exists a large class of Hamiltonians in quantum mechanics that are not stable with respect to this large-particle limit, losing quantum coherence. We know for certain that there exist other situations where quantum coherence is kept to a large extent in many-body systems. This would mean that there exist situations where quantum fluctuations are not damped out as the number of particles increases. But the very existence of the effect implied by the Lieb and Simon theorem means that quantum mechanics has an internal mechanism producing time asymmetry. This, together with environmental decoherence (e.g. the box containing a gas is classical, and so on), should grant a full understanding of the situation at hand.

Finally, we can say that Maccone’s attempt, being along this line of thought, is a genuine way to understand the origin of time asymmetry from quantum mechanics. I hope his ideas will meet with luck.

Update: At Cosmic Variance you will find an interesting post, and a discussion worth reading, involving Sean Carroll, Lorenzo Maccone and others on the questions opened by Lorenzo’s paper.