One of the most relevant open questions currently under study in physics is how a classical world emerges from the laws of quantum mechanics. This can be summed up in a standard philosophical question: "How does reality form?". We have learnt from standard quantum mechanics courses that one just takes the mathematical limit ℏ → 0 and the classical limit emerges from quantum mechanics. Indeed, as always, things are not that simple: ℏ in Nature is never zero, and this means that all the objects we see should be in a quantum mechanical state. So, why do we never observe weird behaviors in our everyday life? What is it that cuts out most of the Hilbert space states to maintain a systematic classical behavior in macroscopic objects? More carefully stated, the question can be put as: "Where is the classical-quantum border, if any?". Indeed, if we were able to draw such a border we could buy the whole Copenhagen interpretation and be happy.

A proposal that is meeting increasing agreement is environmental decoherence. In this case one assumes that an external agent does the job, erasing all the quantum behavior of a system and leaving only a superselected set of pointer states that guarantees a classical behavior. From a mathematical standpoint one can see the interference terms disappear, but one cannot say what exact state the system is left in, leaving the measurement problem in some way unsolved. It should be said that the starting point of this approach was the pioneering work of Caldeira and Leggett, who first studied quantum dissipation. To get an idea of how environmental decoherence should work, Einstein's question "Do you really believe that the moon is not there when we do not look at it?" is simply answered through the external effect of the sun's radiation and, maybe, the cosmic microwave background radiation, which should act as a constant localizing agent, even if I have never seen such an interpretation in the current literature. It is clear that trying to explain a physical effect through a third, undefined agent is somewhat questionable, and one sometimes reads in the literature applications of this idea without a real understanding of what such an agent should be. This happens typically in cosmology, where the emergence of classicality cannot be easily understood. How unsatisfactory such an approach may be can be seen from the relevant conclusion that it gives strong support to a multiverse interpretation of quantum mechanics. Of course, this can be a welcome conclusion for string theorists.
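The mechanism can be made concrete with a toy model (my own illustrative sketch, not taken from any particular paper; the states and the angle θ are assumptions): a qubit prepared in (|0⟩+|1⟩)/√2 gets entangled with N environment qubits, and the off-diagonal (interference) terms of its reduced density matrix are suppressed by the overlap of the two conditional environment states, which shrinks exponentially with N.

```python
import numpy as np

def env_state(n, theta, sign):
    """Product state of n environment qubits, each cos(theta)|0> + sign*sin(theta)|1>."""
    single = np.array([np.cos(theta), sign * np.sin(theta)])
    state = np.array([1.0])
    for _ in range(n):
        state = np.kron(state, single)
    return state

def reduced_system_state(n_env, theta):
    """|psi> = (|0>|E0> + |1>|E1>)/sqrt(2); return the qubit's 2x2 reduced density matrix."""
    e0 = env_state(n_env, theta, +1.0)  # environment state conditioned on |0>
    e1 = env_state(n_env, theta, -1.0)  # environment state conditioned on |1>
    psi = (np.kron([1.0, 0.0], e0) + np.kron([0.0, 1.0], e1)) / np.sqrt(2)
    d_env = 2 ** n_env
    m = psi.reshape(2, d_env)
    return m @ m.conj().T               # partial trace over the environment

for n in (1, 5, 20):
    rho = reduced_system_state(n, theta=0.3)
    print(n, abs(rho[0, 1]))            # off-diagonal ("interference") term
```

With θ = 0.3 the off-diagonal term falls off as 0.5·cos(2θ)^N while the diagonal populations stay at 1/2: the interference is gone, but the model never tells us which pointer state the system actually ends up in.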

Today a preprint by Steven Weinstein of the Perimeter Institute appeared on arXiv, proving a theorem showing that environmental decoherence cannot be effective in producing classical states from generic quantum states. Indeed, all the applications of environmental decoherence in the literature consider just well-built toy models that, by construction, produce classicality; but this behavior is not generic, in agreement with the theorem proved by Weinstein. So, the question is still there, unanswered: "Where is the classical-quantum border, if any?". We have already seen here that the thermodynamic limit, that is, a border at infinity, is an answer, but the question requires a deep experimental investigation. A hint of this was seen in interference experiments with large molecules by Zeilinger's group. This group has not produced any new results on this since 2003, but their latest paper showed some kind of blurry behavior with larger molecules. This effect has not been confirmed, and one cannot say whether it was just a problem of the apparatus.

The question is still there and we can state it as: "How does reality form?".

I don’t find Weinstein’s argument too convincing. He takes an average of distinguishability (between a trivial mixed state for S and a projection of a pure state for S+E to S) over all pure states of S+E, and finds that the projected state is likely close to what would be obtained if we instead computed the distinguishability for the average of all pure states of S+E (i.e., a trivial mixed state, which would give us a trivial mixed state for S). Why should we take a pure state to be characteristic of a system+thermal environment? The dynamical argument that follows exploits the evolution of the pure state for S+E into another pure state, so the assumption that a pure state for S+E is a good model is fundamental to his argument.
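The concentration effect described here can be checked numerically. The sketch below is my own (the qubit system, the Haar-random state for S+E, and the dimensions are all assumptions for illustration): it traces out the environment from a random pure state and computes the trace distance of the reduced state from the maximally mixed state I/d. As the environment dimension grows, the distance shrinks like 1/√d_E, so the reduction becomes almost indistinguishable from the trivial mixed state, as the averaging argument suggests.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_pure_state(dim):
    """Haar-random pure state: normalized complex Gaussian vector."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def reduced_state(psi, d_sys, d_env):
    """Partial trace of |psi><psi| over the environment factor."""
    m = psi.reshape(d_sys, d_env)
    return m @ m.conj().T

def trace_distance(rho, sigma):
    """D(rho, sigma) = (1/2) * sum of singular values of rho - sigma."""
    return 0.5 * np.sum(np.linalg.svd(rho - sigma, compute_uv=False))

d_sys = 2
for d_env in (2, 64, 2048):
    psi = random_pure_state(d_sys * d_env)
    rho = reduced_state(psi, d_sys, d_env)
    print(d_env, trace_distance(rho, np.eye(d_sys) / d_sys))
```

Whether a pure state for S+E is the right thing to average over is, of course, exactly the point in dispute; the code only shows what follows if it is.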

The usual model, with S in a pure state initially and E in a thermal mixed state, is of course idealized, but Weinstein's model is very different (it's not a refinement), and how is it better? Weinstein's point in his discussion, that a product state is highly non-generic, can just as well be applied to his use of a pure state for S+E. It seems significant that distinguishability is a non-linear function of two density matrices, so that averaging over all possible mixed states for S+E may not give the same result. Even more, if the averaging is over all mixed states consistent with the data for a particular experiment, then I very much doubt he can obtain a result that is anywhere near as straightforward.

Hi Peter,

I think this is a main concern for environmental decoherence and, indeed, if the starting assumption is that S+E is in a pure state, this is how it should be. The reason for this relies on the fact that environmental decoherence implies that, if you take the environment and the state together, this should again behave quantum mechanically, as you know that the missing information about the state is in the environment. You can take this argument as far as you like by adding a further "super-environment" that erases information from the environment, and so on. So, I think that the hypothesis that S+E is in a pure state can be acceptable. Indeed, this is another point about decoherence that I failed to mention.

Marco

So, if we claim that the universe is in a typical pure state (in a very large, 10^100, 10^1000, …, but not infinite dimensional Hilbert space), restriction of that state to a finite number of degrees of freedom is almost always not distinguishable from the identity. Haven’t we just proved by contradiction that the universe cannot be in a typical pure state, since in fact we find that the state of almost every subsystem around us is distinguishable from the identity?

If we restrict the pure states we average over to some non-typical set of pure states (a set of measure zero in the set of pure states), consistent with the states of the subsystems we observe, there would be some subsystems (the ones we observe, by construction) that would have states distinguishable from the identity. So, the universe is in an atypical pure state, not in a typical pure state?
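The contrast can be illustrated directly (an illustrative sketch of my own, not Weinstein's construction; the dimensions and states are assumptions): an atypical product state, in which the system is definitely in |0⟩ and unentangled with the environment, reduces to a state at trace distance exactly 1/2 from the maximally mixed state, while a typical Haar-random pure state of S+E reduces to something almost indistinguishable from it.

```python
import numpy as np

def reduced_state(psi, d_sys, d_env):
    """Partial trace of |psi><psi| over the environment factor."""
    m = psi.reshape(d_sys, d_env)
    return m @ m.conj().T

def trace_distance(rho, sigma):
    """D(rho, sigma) = (1/2) * sum of singular values of rho - sigma."""
    return 0.5 * np.sum(np.linalg.svd(rho - sigma, compute_uv=False))

d_sys, d_env = 2, 1024
rng = np.random.default_rng(1)

# Atypical state: S definitely in |0>, unentangled with E (a measure-zero case).
env = rng.normal(size=d_env) + 1j * rng.normal(size=d_env)
env /= np.linalg.norm(env)
psi_product = np.kron(np.array([1.0, 0.0]), env)
rho_product = reduced_state(psi_product, d_sys, d_env)

# Typical (Haar-random) pure state of S+E.
v = rng.normal(size=d_sys * d_env) + 1j * rng.normal(size=d_sys * d_env)
psi_typical = v / np.linalg.norm(v)
rho_typical = reduced_state(psi_typical, d_sys, d_env)

maximally_mixed = np.eye(d_sys) / d_sys
print(trace_distance(rho_product, maximally_mixed))  # distinguishable: exactly 0.5
print(trace_distance(rho_typical, maximally_mixed))  # nearly indistinguishable
```

So subsystems with states distinguishable from the identity, like the ones we actually observe, only come from the measure-zero set of atypical global states.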

Peter,

I think you are entering dangerous turf. How do you compute your wave function of the universe? Do you solve the Wheeler-DeWitt equation? Do you consider a string theory? What else? In any case, you will get a perfectly quantum state that will evolve with a unitary evolution operator. What will make all this classical, as we currently observe, whatever the state you consider?

Sorry I introduced the state of the universe, which is not a big deal to me (except that I can't see why we would claim on a priori grounds that anything less than the whole universe should be in a pure state). Just say that there is a very large Hilbert space. Suppose we carry out several experiments, each of which fixes the density matrix associated with a corresponding subspace of the large Hilbert space. The state of the large system should be such that its reductions to the subspaces are the observed states; that is, experiment constrains what the state of the large system might be. We can't fix the state of the large system precisely without more experiments, but we had better not fix the state of the large system to be inconsistent with our experimental data. I think Weinstein does, by constraining the class of states allowed for the large system to be a typical pure state, which he then shows is inconsistent with any nontrivial experimental result.

I have no special affinity for decoherence; all I'm saying is that I'm not much taken by Weinstein's analysis. I think an empiricist has no special need for decoherence: it's enough for the probability densities predicted by a quantum theoretical model for an experiment to match the observed statistics of experimental data well enough.

I understand your point. But as far as I can tell from my experience with the current literature, no definite statements are ever made about the specific state of the environment. The only relevant thing you have to be sure of is that it exists and that you can trace it away at the end of your computations. In any case, I agree with you that, in order to make the argument stronger, a more generic state should be chosen for the environment. But whatever you take, you cannot avoid a conceptual problem, that is, infinite regress.

Marco