A problem that I have often treated on this blog is the question of the quantum-classical transition. This question is hotly debated by people working in quantum optics, quantum computation, and wherever the foundations of quantum mechanics enter. This problem today appears far from settled and is a heavy burden left to us by the fathers of quantum mechanics. One thing has been firmly established: environmental decoherence. Fighting this effect is a problem experimentalists face in their everyday activity. But we know that this cannot be the whole story.

Some time ago Wojciech Hubert Zurek, one of the main contributors to environmental decoherence, claimed that Hyperion, a moon of Saturn, behaves classically in its motion only because of environmental decoherence; otherwise we would observe a macroscopic quantum object smeared over its orbit, as happens to electrons in an atom. Of course, some people contested these conclusions and came out with a sound explanation of the classicality of Hyperion’s motion without the need for environmental decoherence. One of these authors is Leslie Ballentine. I think a lot of people have read his beautiful book on quantum mechanics. Ballentine and Nathan Wiebe wrote a paper (see here), published in Physical Review A (see here), where they soundly proved that Hyperion behaves classically without resorting to any kind of external agent. In some way they gave a hint of an intrinsic emergence of classicality for macroscopic objects (“for all practical purposes”, as John Bell taught us). This means that classicality may be an emergent property of quantum objects.

Of course, defenders of environmental decoherence tried to attack Ballentine and Wiebe’s view (see here). Ballentine’s answer is here. It gives a lucid view of the present criticisms of environmental decoherence (which, I would like to recall, is a genuinely observed effect), claiming an intrinsic decoherence effect for isolated quantum systems. The last word has not been said yet; future experiments will tell.

Most supporters of environmental decoherence are well aware of the limitations of this approach and, to that extent, share Ballentine’s views. But this is only part of the truth. I think that some new, deep understanding of how reality forms is in view here. Subtleties are involved, and this may explain the difficulties researchers have met so far.

This entry was posted on Tuesday, March 10th, 2009 at 11:23 am and is filed under Physics.

17 Responses to Ballentine and the decoherence program


Dear Marco,

could you please tell me a rational reason why you think that decoherence – and its correct derivation of the classical/quantum boundary – is not settled?

I think it is completely settled – and the only sense in which it is not settled at the sociological level is that there exist many bigots who don’t want to accept the established truth.

Best wishes

Lubos

Hi Lubos,

I think one can easily prove a theorem showing that a quantum system with N particles recovers the semiclassical limit (the Thomas-Fermi approximation) as N is taken to infinity. This is true independently of the presence of an external agent. But the question is not whether environmental decoherence is true (it is, because experiments say so) but whether it is all one needs to explain the quantum-classical transition.

The point is: can the cat inside the Schroedinger box be considered classical, FAPP? If some kind of intrinsic effect exists that produces the transition, the answer is yes and the ambiguity is removed. What has not been settled is whether such an effect exists. There is some evidence from NMR experiments that is quite compelling. Anyhow, a first step would be to recognize that there is a scientific problem here awaiting an answer.

Best

Marco

Dear Marco,

the reason why your assertion can’t be “easily proved” is that the assertion is untrue.

It is simply not true that an “infinite number of particles”, whatever is the definition of a particle in a general system, is enough for classical description to capture all the key dynamics of the system.

A cool enough material may have trillions of atoms but its states are still classified in terms of fully coherent quantum phonons.

Decoherence actually calculates exactly what is the required limit or inequality in which quantum coherence disappears and the classical framework becomes legitimate. It is not true that the inequality is “N is greater than one”. The point where quantum phenomena disappear depends on the temperature, the strength of interactions, the speed of processes, the geometric shape of objects, the strength of the interactions with the environment, the amount of stuff in the environment, and the extent to which locality is respected by the interactions, among other things.
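The dependence on temperature, mass, and coupling that Lubos lists can be made quantitative. A commonly quoted estimate from the decoherence literature (added here for reference; the symbols are the standard ones, not taken from this thread) relates the decoherence time \(\tau_D\) of a superposition of spatial extent \(\Delta x\) to the relaxation time \(\tau_R\):

```latex
\tau_D \;\sim\; \tau_R \left(\frac{\lambda_{\mathrm{th}}}{\Delta x}\right)^{2},
\qquad
\lambda_{\mathrm{th}} \;=\; \frac{\hbar}{\sqrt{2\, m\, k_B T}},
```

so for a macroscopic mass and separation, \(\tau_D \ll \tau_R\) by many orders of magnitude: the crossover depends on the Hamiltonian and the environment, not just on the particle number.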

It is a complex dynamical question. This classical/quantum boundary depends on the Hamiltonian – it’s the whole point of decoherence – and you are completely missing every part of this point you can conceivably miss.

Concerning completeness: how could a complete theory that tells you the location of the quantum/classical crossover for every situation, in every physical system one can write down, possibly be incomplete? What exactly is missing? The statement that it’s incomplete makes no sense whatsoever.

The question whether a cat can be considered “classical” is not sufficiently well-defined. Some aspects are well enough described by classical concepts, others need quantum mechanics – like the stability of atoms in the cat and the absorption spectrum of the cat or whatever.

What makes sense in physics is to ask for the probabilities that something occurs with the cat. For certain simple macroscopic processes with the cat, classical (statistical) physics agrees with quantum mechanics. In the realm where quantum interference matters, classical approximations are inapplicable.

I have no idea what you’re saying about NMR. All actual observable questions about NMR are fully answered by quantum mechanics, and they have been understood for decades (and their principle for 80 years). There is no open problem here. What you seem to be unwilling to accept is the very fact that quantum mechanics is actually true and the right description for *every* system, whether it is large or small, and classical physics is always just an approximation that holds in certain regimes but not others.

Best wishes

Lubos

Otherwise, the Thomas-Fermi approximation is just a trivial, nearly tautological approximation assuming that the density of levels is large, which can be easily seen for conventional many-body systems, so that the continuum is a fine description. It’s explained e.g. here

http://en.wikipedia.org/wiki/Gas_in_a_box

By the way, I would recommend you the 800+cits 1991 paper about decoherence by Zurek – an updated version is here:

http://arxiv.org/ftp/quant-ph/papers/0306/0306072.pdf

Believe me that it is way more sensible a reference than all the hyperlinks you have included, and it is much wiser to study the words of the people who actually understand something rather than to impress yourself with ignorance promoted to a virtue.

Also, a general great paper about interpretations of QM is one by Omnes:

http://www.slac.stanford.edu/spires/find/hep/www?rawcmd=find+a+omnes+and+title+consistent

Unfortunately, I don’t see the full text online.

Dear Lubos,

The following theorem by Lieb and Simon does hold:

“The limit of number of particles N going to infinity for the energy functional of N particles interacting with Coulombic forces is the Thomas-Fermi model”

Check this: E. Lieb, Rev. Mod. Phys. 53, 603-641 (1981), p. 620ff. The Thomas-Fermi approximation is a semiclassical approximation. You should combine both these results. These facts are well known to nuclear physicists.
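For readers without the review at hand, the Thomas-Fermi energy functional in question has the standard form (reproduced here from the general literature, so conventions should be checked against Lieb's paper):

```latex
E_{\mathrm{TF}}[\rho] \;=\; C_{\mathrm{TF}} \int \rho(x)^{5/3}\,d^3x
\;-\; Z \int \frac{\rho(x)}{|x|}\,d^3x
\;+\; \frac{1}{2}\iint \frac{\rho(x)\,\rho(y)}{|x-y|}\,d^3x\,d^3y,
```

and the Lieb-Simon result says that the quantum ground-state energy approaches the minimum of this functional as N goes to infinity. Since Thomas-Fermi is a semiclassical (effectively \(\hbar \to 0\)) approximation, large N pushes the system toward semiclassical behavior.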

What about macroscopic coherent states? These states can be seen to evolve following the Wigner-Poisson set of equations, which have the dynamical Thomas-Fermi equations as a limit. So you can study the stability of the above result. You will find that it depends on the way you prepare your system. Classical states appear as the most generic ones, and they are stable. As you can see, it is all well encoded.

Of course I am a follower of environmental decoherence, simply because it is true. But I would like to understand whether it is truly the last word on this matter. If you are able to give an answer to this messy situation, you should write a paper and send it to PRA, because a lot of physicists are awaiting the final answer to these questions. You know better than me that faith on its own cannot be enough to satisfy our needs.

Best wishes,

Marco

Quoting Ballentine:

“Schlosshauer raises some important questions about the emergence of classical behavior from quantum mechanics (QM) and the appropriate criterion for effective classicality. These are related to the broader issues of the meaning and interpretation of the state vector in QM, and to the narrower issue of the role of decoherence in these problems.”

Attempts to interpret the QM state vector generally infer that an eigenvector state yields only one measurement result. For example, the state vector |c_2> (cat dead) means that a measurement on the cat will yield a dead-cat result. Such situations are valid only in idealized (classical) situations where the elementary particles are classically point-like. Reality is different: when I measure the position of an extended particle |x_1>, the result will be approximately x_1 but hardly ever exactly x_1.

So in order to interpret quantum mechanics correctly, I think one has to leave aside classical ‘result=eigenvalue’ attempts and focus instead on analogies where macroscopic objects can be treated quantum-mechanically, like rotating needles or rods. Classical behavior then emerges from statistical ensembles of such objects.

Dear Marco,

concerning your question about completeness, is there an operational scientific – non-sociological – way to formulate your question? What would happen if the description of the emergence of the classical world were found “incomplete”?

Would it allow one to calculate something new, at least in principle, i.e. some results XY from some initial data AB? Because I don’t see any possible addition to the scheme that has not already been falsified, it is complete according to my taste. Now, no theory in physics can be proved to be a complete theory of everything, but merely insisting that something is incomplete, without knowing what could be missing, is not a constructive approach, and the fact that this demand is applied so eagerly in this case and not others is a proof of the bias of the dissatisfied people.

For Arjen: quite on the contrary. In Schrödinger cat’s thought experiments, one talks about two states for the sake of simplicity. There are many “classical” states that a cat can be found in: they are labeled by many local “classical” degrees of freedom. But the conclusion is unchanged from the 2-state toy model: almost all of the linear superpositions are “unphysical” as final detectable states.

I said “quite on the contrary” because the “result=eigenvalue” paradigm is a completely general and valid point about the real world because it is really a postulate of quantum mechanics and the latter is universally valid. For example, if you ask a Yes/No question, in the real world, it is always represented by a projection operators (with eigenvalues 0/1, meaning No/Yes), and QM always predicts the probabilities of Yes and No, regardless of the number of microstates of the system.
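This Yes/No structure is easy to make concrete. The following is a toy two-level sketch (the state and projector are hypothetical choices for illustration, not anything discussed above):

```python
import numpy as np

# A Yes/No question is represented by a projection operator with
# eigenvalues 0 (No) and 1 (Yes); QM predicts only the probabilities.
P_yes = np.array([[1.0, 0.0],
                  [0.0, 0.0]])      # projects onto the "Yes" basis state

psi = np.array([0.6, 0.8])          # a normalized state vector

p_yes = float(psi @ P_yes @ psi)    # Born rule: <psi|P|psi>
p_no = 1.0 - p_yes

# Sanity checks: P is a projector (Hermitian and idempotent).
assert np.allclose(P_yes, P_yes.T) and np.allclose(P_yes @ P_yes, P_yes)
print(f"P(yes) = {p_yes:.2f}, P(no) = {p_no:.2f}")  # P(yes) = 0.36, P(no) = 0.64
```

Regardless of how many microstates the system has, any Yes/No observable has this projector form; only the dimension of P grows.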

Best wishes

Lubos

Hi Lubos,

The point is not whether environmental decoherence is a complete theory but, merely, whether there exists an intrinsic decoherence effect in quantum systems granting the emergence of classical behavior. This is a sound scientific question looking for an answer. It can be tested with interference or quantum-optics experiments or NMR measurements, as has already been done.

Best wishes,

Marco

For Lubos, regarding the “result=eigenvalue” paradigm. This is a historical assumption, not a postulate; see Dirac’s quote from his Principles of QM. As he says further: “These assumptions are reasonable on account of the eigenvalues of real linear operators being always real numbers”. The postulates of a physical theory in accordance with experimental facts need stronger ‘physical’ reasons than the ‘weak’ mathematical assumption invoked by Dirac. As such, this “result=eigenvalue” assumption is classically founded but disagrees with experimental evidence about ordinary extended objects (when the location of an object |x_1> is measured, it is hardly ever found at the exact position x_1). The correct postulate is “expected result=eigenvalue”, which makes the difference between idealized classical and intuitive quantum reality.

For Marco: if we take the quantum macro-droplet investigated by Couder and Fort, interference effects disappear when the droplet decoheres from the surface wave, i.e. when it is too big or when it is disturbed by an environmental event.

Kind regards,

Arjen

Sorry, link to Dirac’s quote didn’t appear well due to double quote characters in the url. The page I was referring to is at: http://books.google.com/books?id=XehUpGiM6FIC&pg=PA35&vq=“we+now+make+some+assumptions”&source=gbs_search_s&cad=0

Dear Arjen,

Thank you for the ref I was not aware of. NMR experiments were performed by Krojanski and Suter:

http://scitation.aip.org/getabs/servlet/GetabsServlet?prog=normal&id=PLRAAN000074000006062319000001&idtype=cvips&gifs=yes

Pioneering works by Pastawski’s group should also be acknowledged.

Marco

Dear Marco,

decoherence is not “intrinsic”; it is caused by the interactions with the environment. An “intrinsic” decoherence in a closed system would be called “a loss of unitarity” that would violate the basic postulates of quantum mechanics.

One can pay thousands of dollars for useless experiments to find violations of unitarity, but it is very clear that none of these experiments can ever find such a thing, because quantum mechanics, which implies that such an “intrinsic decoherence” can’t exist, has already been checked accurately enough to exclude such possible new deviations in experiments doable with the same apparata (but focusing on a particular deviation from QM predictions only).

Paying thousands of dollars for an experiment is not a scientific proof that the question justifying the experiment is a good, “sound”, and open question. People often throw a lot of money at complete crap. The validity of unitarity has been questioned, by sensible scientists, only in extreme phenomena in the Planckian regime, and even in this regime, it has been established that there is no violation of unitarity.

Best wishes

Lubos

Dear Arjen,

quite on the contrary. The “result=eigenvalue” is not a historical assumption but rather a key paramount basis of quantum mechanics whose elimination would make all of modern physics impossible.

It is, on the contrary, your doubts about the validity of quantum mechanics that are purely historical, vaguely philosophical, and psychological in character and that have nothing to do with the actual technical results, advances, and open questions – both theoretical and experimental – in science.

Dirac: Dirac may have been feeling slightly uncertain about the framework when he wrote the textbook you mentioned in 1930 but I assure you that physics is no longer uncertain about these paramount things in 2009 – a progress that often occurs after 79 additional years of hard work. Every single person who thinks that the postulates of quantum mechanics are “weak” pillars of physics in 2009 is a full-fledged crackpot.

Your observation that a particle is not observed exactly at point x1, whatever x1 actually is, has nothing to do with the accuracy of the statement that “result=eigenvalue”, which is completely precise. It is only about the imperfection of the measuring apparata, which actually measure a different observable than the “simple one” that you want them ideally to measure. But it is still true that measurements correspond to observables, Hermitian linear operators, and the results one can obtain are always eigenvalues of these operators.
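The “result=eigenvalue” statement can likewise be illustrated with a small numerical sketch (the observable and state below are toy choices for illustration only):

```python
import numpy as np

rng = np.random.default_rng(42)

# Measuring a Hermitian observable A can only return one of A's
# eigenvalues, with Born-rule probabilities |<a_i|psi>|^2.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])               # Hermitian observable, eigenvalues 1 and 3

eigvals, eigvecs = np.linalg.eigh(A)     # eigvecs columns are eigenvectors
psi = np.array([1.0, 0.0])               # prepared state
probs = np.abs(eigvecs.T @ psi) ** 2     # Born rule; 0.5 each here
probs = probs / probs.sum()              # guard against rounding

# Simulate many measurements: every outcome is an eigenvalue, never a
# value in between, even though <A> = 2 is not an eigenvalue itself.
outcomes = rng.choice(eigvals, size=1000, p=probs)
assert set(np.round(outcomes, 12)) <= {1.0, 3.0}
print(outcomes.mean())                   # close to <psi|A|psi> = 2
```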

All the best

Lubos

Dear Lubos,

I would not say so, as one is applying an approximation anyway and is reduced to a FAPP situation, as for environmental decoherence. You can understand this using statistical-mechanics ensembles and the way they recover thermodynamics. There you study the fluctuations and see that they are negligibly small, so you can safely neglect deviations from equilibrium and work with thermodynamics.
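The fluctuation argument can be sketched numerically. Below is a toy model (independent unit-variance subsystems standing in for a generic non-critical system) showing the 1/sqrt(N) suppression of relative fluctuations:

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_fluctuation(n, samples=500):
    """Std/mean of the total energy of n independent subsystems,
    each contributing a unit-mean, unit-variance random energy."""
    totals = rng.normal(loc=1.0, scale=1.0, size=(samples, n)).sum(axis=1)
    return totals.std() / totals.mean()   # expected ~ 1/sqrt(n)

# Relative fluctuations shrink like 1/sqrt(N): thermodynamics emerges FAPP.
for n in (100, 10_000):
    print(n, relative_fluctuation(n))     # roughly 0.1, then 0.01
```

Near a phase transition this toy model fails by construction, since the subsystems are no longer independent; that is exactly the point of the next paragraph.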

Then look at phase transitions. The above argument no longer applies: fluctuations extend over the whole volume (macroscopic quantum coherence) and you cannot avoid turning back to statistical mechanics.

Let me state the following. There is a well-known relation between unitary time evolution and the density matrix. An obvious analogy is seen in quantum phase transitions. But what meaning should be attached to a thermodynamic limit on unitary evolution?
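The formula itself was lost from this comment on the original page; the relation presumably intended is the standard Wick rotation connecting the evolution operator to the thermal density matrix (a reconstruction, offered with that caveat):

```latex
\rho \;=\; \frac{e^{-\beta H}}{Z}
     \;=\; \frac{1}{Z}\,\left. e^{-iHt/\hbar}\right|_{t=-i\hbar\beta},
\qquad
Z = \operatorname{Tr}\, e^{-\beta H}.
```

Under this mapping, imaginary time plays the role of an extra dimension, which is why quantum phase transitions admit a classical statistical-mechanics description; the open question in the comment is what a thermodynamic limit means for the unitary factor itself.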

Best wishes,

Marco

Dear Lubos,

Please don’t alter what I’ve written. Contrary to what you write, I do not have a single doubt about the validity of Quantum Mechanics and its postulates.

When you say “it is only about the imperfection of the measuring apparata that actually measure a different observable than the ‘simple one’ that you want them ideally to measure”, you ascertain that the correct postulate for QM is “expected result=eigenvalue” and not an idealized “result=eigenvalue”. This uncertainty in the measurement result may come from the imperfection of the measuring apparatus as well as from the imperfection of the measured system (an intrinsic indetermination due to its structure). In a quantum measurement process, observer and observed are entangled: it is impossible to conclude whether the uncertainty of the result originates from one or from the other. Choosing one or the other is metaphysics.

Kind regards,

Arjen

[…] rubbish. For example the high priest of the Ensemble Interpretation Ballentine thinks its rubbish: https://marcofrasca.wordpress.com/2009/03/10/ballentine-and-the-decoherence-program/ This is science – if it isn’t what you like then you can view it anyway you want as long as […]