I have tried in different ways to get this paper through the standard channels of the community. As far as I can tell, it is unpublishable: journals will not even send it to referees to start a normal review process, as if everyone were trying to keep it from becoming known. The argument is always the same: a reformulation of quantum mechanics using stochastic processes, this time through noncommutative geometry. I apologize to the community if this unacceptable approach has bothered people around the world, but this is the fate of some ideas. Of course, if somebody has the courage and the will to publish it, let me know and I will appreciate the attempt with infinite gratitude.

Now, back to sane QCD.

Happy new year!

Fooling with mathematicians



I am still working with stochastic processes and, as my readers know, I have proposed a new view of quantum mechanics assuming that a meaning can be attached to the square root of a Wiener process (see here and here). I was able to generate it through a numerical code. A square root of a number can always be taken, irrespective of any deep and beautiful mathematical analysis. The reason is that this is something really new and deserves a different approach, much as happened with Dirac's delta, which initially met with skepticism from the mathematical community (it simply did not make sense with the knowledge of the time). Here is some Matlab code if you want to try it yourselves:

nstep = 500000;
dt = 50;
% Brownian increments (displacements) with variance dt/nstep
dB = normrnd(0,sqrt(dt/nstep),1,nstep);
% Brownian path
B = cumsum(dB);
% Square root of the Brownian displacements
% (Matlab returns an imaginary result for negative entries)
sqrtdB = sqrt(dB);

Nothing can prevent you from taking the square root of a number such as a Brownian displacement, and so all this has a perfectly sound meaning numerically. The point is just to understand how to give it a full mathematical meaning. The wrong approach is to throw it all away, claiming it does not exist. This is exactly the behavior I met from Didier Piau. Of course, Didier is a good mathematician, but he simply refuses to accept that such concepts can have a meaning at all, based on what has so far been codified in the area of stochastic processes, notwithstanding that they can easily be computed on your personal computer at home.
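For readers without Matlab, here is a minimal Python/NumPy analogue of the snippet above (the variable names are my own). The elementwise square root of the displacements is perfectly computable; the negative displacements simply come out imaginary, just as with Matlab's sqrt:

```python
import numpy as np

rng = np.random.default_rng(0)
nstep = 500000
dt = 50.0

# Brownian increments (displacements) with variance dt/nstep
dB = rng.normal(0.0, np.sqrt(dt / nstep), nstep)
# Brownian path
B = np.cumsum(dB)
# Elementwise square root of the displacements;
# negative displacements yield a purely imaginary result
sqrt_dB = np.sqrt(dB.astype(complex))
```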

But this saga is not over yet. This time I was trying to compute the cubic root of a Wiener process, and I posted this at Mathematics Stack Exchange. I put the question with the simple idea in mind of considering a stochastic process with a random mean, and I did not realize that I was provoking a small crisis again. This time the question is the existence of the process {\rm sign}(dW). Didier Piau immediately wrote that it does not exist. Again, here is the Matlab code that computes it very easily:

nstep = 500000;
dt = 50;
% Brownian increments (displacements) with variance dt/nstep
dB = normrnd(0,sqrt(dt/nstep),1,nstep);
% Brownian path
B = cumsum(dB);
% Sign and absolute value of the displacements
dS = sign(dB);
dA = dB./dS;   % equals abs(dB) whenever dB ~= 0

Didier Piau and a colleague of his just complained about the way Matlab performs the sign operation. My view is that it is all legal: Matlab takes + or – depending on the sign of the displacement, something that could be done by hand and that does not imply anything exotic. What is exotic here is the strong opposition this evidence meets, notwithstanding that it is easily understandable by everybody and, of course, easily computable on a desktop computer. The expected distribution for the signs of Brownian displacements is a Bernoulli with p=1/2. Here is the histogram from the above code:

Histogram of {\rm sign}(dW)

This has mean 0 and variance 1, as it should for N=\pm 1 and p=\frac{1}{2}, and this can be verified after a few Monte Carlo runs. It agrees with what I discussed here at Mathematics Stack Exchange: a displacement in a Brownian motion is a physical increment or decrement of the moving particle and has a sign that can be handled statistically. My attempt to compare all this to the case of Dirac's delta drew a complaint of overstatement, since the delta was really useful while my approach is not (but when Dirac put forward his idea it was just airy-fairy for the time). Of course, a reformulation of quantum mechanics would be a rather formidable support for all this, but this mathematician does not seem to realize it.
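For readers who want to check the Bernoulli statistics themselves, here is a short Python/NumPy sketch (my own variable names) estimating the mean and variance of the signs over one run:

```python
import numpy as np

rng = np.random.default_rng(0)
nstep = 500000
dt = 50.0

# Signs of the Brownian increments (displacements)
dB = rng.normal(0.0, np.sqrt(dt / nstep), nstep)
signs = np.sign(dB)

# For N = +/-1 with p = 1/2 the mean is 0 and the variance is 1
print(signs.mean())  # close to 0
print(signs.var())   # close to 1
```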

So, in the end, I am somewhat surprised by the behavior of the community toward novelties. I can understand skepticism, it belongs to our profession, but when facing new concepts whose existence can easily be checked numerically, I would prefer a more constructive attitude of trying to understand, rather than immediate dismissal. It appears as if the history of science never taught anything, leaving us with a boring repetition of stereotyped reactions to something that would instead be worth further consideration. Meanwhile, I hope my readers will enjoy playing around with these computations using some exotic mathematical operations on a stochastic process.

Marco Frasca (2012). Quantum mechanics is the square root of a stochastic process. arXiv: 1201.5091v2

Quantum mechanics and the square root of Brownian motion



There is a very good reason why I was silent in the past days. I was involved in writing one of the most difficult articles since I started doing research (and that is more than twenty years now!). This paper arose from a very successful collaboration with two colleagues of mine: Alfonso Farina and Matteo Sedehi. Alfonso is a recognized worldwide authority in radar technology and last year got a paper published here about the ubiquitous Tartaglia-Pascal triangle and its applications in several areas of mathematics and engineering. What left Alfonso unsatisfied was the way the Tartaglia-Pascal triangle fits into quantum mechanics. It appeared to be a somewhat unsettled matter. The Tartaglia-Pascal triangle gives, in the proper limit, the solution of the heat equation typical of Brownian motion, the most fundamental of all stochastic processes. But when one comes to the Schrödinger equation, notwithstanding the formal resemblance between the two equations, the presence of the imaginary term changes things dramatically: the variance of a free-particle wave packet is seen to spread like the square of time rather than linearly. So Alfonso asked me to try to clarify the situation and see what the role of the Tartaglia-Pascal triangle is in quantum mechanics. The question is almost as old as quantum mechanics itself. Several people have tried to explain the probabilistic nature of quantum mechanics through some kind of Brownian motion of space, and the most famous of these attempts is due to Edward Nelson. Nelson was able to show that there exists a stochastic process producing hydrodynamic equations from which the Schrödinger equation can be derived. This idea turned out to yield a description of quantum mechanics similar to the one David Bohm devised. So, this approach was exposed to criticisms that can be summed up in a paper by Hermann Grabert, Peter Hänggi and Peter Talkner (see here), denying any possible representation of quantum mechanics as a classical stochastic process.
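The proper limit mentioned above is the classical de Moivre-Laplace theorem: the normalized rows of the Tartaglia-Pascal triangle converge to the Gaussian kernel that solves the heat equation. A minimal Python sketch of this fact (my own construction, not taken from Alfonso's paper):

```python
import math

import numpy as np

n = 200  # row of the Tartaglia-Pascal triangle
# Row n of the triangle, normalized by 2^n: the binomial B(n, 1/2)
k = np.arange(n + 1)
row = np.array([math.comb(n, int(j)) for j in k], dtype=float) / 2.0**n

# Gaussian with matching mean n/2 and variance n/4 (de Moivre-Laplace)
mu, var = n / 2.0, n / 4.0
gauss = np.exp(-((k - mu) ** 2) / (2 * var)) / np.sqrt(2 * np.pi * var)

# The normalized row and the heat kernel are already very close
print(np.max(np.abs(row - gauss)))
```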

So, it is clear that the situation appeared rather difficult to clarify in the face of such notable works. Alfonso, Matteo and I had several discussions, and the conclusion was striking: the Tartaglia-Pascal triangle appears in quantum mechanics rather through its square root! It appeared that quantum mechanics is not itself a classical stochastic process, but the square root of one. This could explain why several excellent people had missed the link.

At this point, it became quite difficult to clarify what a square root of a stochastic process such as Brownian motion should be. There is nothing in the literature, and so I tried to ask trained mathematicians whether anything was known in advanced research (see here). MathOverflow is a discussion forum for advanced research run by the community of mathematicians. It has met with very great success, as testified by the fact that practically all the most famous mathematicians contribute to it regularly. Posting my question resulted in a couple of favorable comments informing me that this question was not known to have an answer. So, I spent a lot of time trying to clarify the idea using some of the very good books available on stochastic processes. In the last few days I was able to get a definite answer: the square root of Brownian motion is computable in a standard way with the Itō integral, reducing to a Brownian motion multiplied by a Bernoulli process. The striking fact is that the Bernoulli process is that of tossing a coin! The imaginary factor emerges naturally out of this mathematical procedure, and now the diffusion equation is the Schrödinger equation. The identification of the Bernoulli process came out thanks to the help of Oleksandr Pavlyk, after I asked this question at Mathematics Stack Exchange. This forum is also for well-trained mathematicians, but the questions put there can also be at a student level. Oleksandr's answer was instrumental for a complete understanding of what I was doing.

Finally, I decided to check with the community of mathematicians whether all this was nonsense or not, and I posted a derivation of the square root of a stochastic process on Mathematics Stack Exchange again (see here). But, to my great surprise, I discovered that some of the concepts I used for the Itō calculus were not understandable at all. I had taken them for granted, but they were not defined in the literature! So, after some discussion, I added important clarifications there and in my paper, making clear what I was doing from a mathematical standpoint. Now you can find all this in my article. Itō calculus must be extended to include all the ideas I was exploiting.

The link between quantum mechanics and stochastic processes is a fundamental one. The reason is that, once one has such a link, an understanding of the fundamental behavior of space-time is obtained: it appears to be a fluctuating entity, but in an unexpected way. This entails a new reformulation of quantum mechanics in the language of stochastic processes. Given this link, any future theory of quantum gravity should recover it.

I take this chance to publicly give great thanks to all the people who helped me reach this important understanding and whom I have cited here. Also the mathematicians who appeared anonymously were extremely useful in improving my work. Thank you very much, folks!

Update: After an interesting discussion here with Didier Piau and George Lowther, we reached the conclusion that the definitions I give in my paper to extend the Itō integral are not mathematically consistent. Rather, when one performs the corresponding Riemann sums, one gets diverging results for the interesting values of the exponent 0<\alpha<1 and for the absolute value. Presently, I cannot see any way to get a sensible definition of this, and so the paper should be considered mathematically inconsistent. Of course, the idea of quantum mechanics as the square root of a stochastic process is there to stay and to be eventually verified, possibly with different approaches and better mathematics.

Further update: I have posted a revised version of the paper with a proper definition of this generalized class of Itō integrals (see here).

Marco Frasca (2012). Quantum mechanics is the square root of a stochastic process. arXiv: 1201.5091v1

Farina, A., Giompapa, S., Graziano, A., Liburdi, A., Ravanelli, M., & Zirilli, F. (2011). Tartaglia-Pascal's triangle: a historical perspective with applications. Signal, Image and Video Processing. DOI: 10.1007/s11760-011-0228-6

Grabert, H., Hänggi, P., & Talkner, P. (1979). Is quantum mechanics equivalent to a classical stochastic process? Physical Review A, 19 (6), 2440-2445. DOI: 10.1103/PhysRevA.19.2440

The question of the arrow of time


A recent paper by Lorenzo Maccone in Physical Review Letters (see here) has produced some fuss. He tries to solve the question of the arrow of time from a quantum standpoint. Lorenzo is currently a visiting researcher at MIT and, together with Vittorio Giovannetti and Seth Lloyd, has produced several important works in the area of quantum mechanics and its foundations. I had the luck to meet him, together with Vittorio, at a conference at Gargnano on Lake Garda. So, it is no surprise to see this paper of his attempting to solve one of the outstanding problems of physics.

The question of the arrow of time is still open. Indeed, one might think that Boltzmann's H-theorem closed the question definitively, but this is false. The theorem was the starting point for a question yet to be settled. Boltzmann presented a first version of his theorem that exhibited one of the most beautiful laws in physics: the relation between entropy and probability. His proof was criticized by Loschmidt (see here), and the criticism was sound. Indeed, Boltzmann had to modify his proof by introducing the so-called Stosszahlansatz, or molecular chaos hypothesis, thereby introducing time asymmetry by hand. Of course, we know for certain that the theorem is true, and so the hypothesis of molecular chaos must be true as well. The question of the arrow of time will be solved only when we know where molecular chaos comes from. This means we need a mechanism, a quantum one, to explain Boltzmann's hypothesis. It is important to emphasize that, to this day, no proof of the H-theorem exists that removes this assumption.
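For definiteness, in standard textbook notation (not taken from any of the papers cited here), the H-theorem states that, for the one-particle distribution f(\mathbf{v},t) of a dilute gas,

```latex
H(t) = \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\,d^3v ,
\qquad
\frac{dH}{dt} \le 0 \quad \text{(granted the Stosszahlansatz)},
```

with the entropy given, up to constants, by S=-k_B H; this connects to the celebrated relation between entropy and probability, S=k_B\ln W.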

Quantum mechanics is the answer to this situation, and it can be so once we know how reality takes form. An important role in this direction could be played by environmental decoherence and how it relates to the question of the collapse. A collapse grants immediate asymmetry in time, and here one has to cope with many-body physics with a very large number of components. In this respect, there exists a beautiful theorem by Elliott Lieb and Barry Simon, two of the most prominent living mathematical physicists, which says:

The Thomas-Fermi model is the limit of quantum theory as the number of particles goes to infinity.

For a more precise statement you can look at Reviews of Modern Physics, page 620ff. The Thomas-Fermi model is just a semiclassical model, and so this fundamental theorem can simply be restated as saying that the limit of quantum mechanics for a very large number of particles is the classical world. In some way, there exists a large class of Hamiltonians in quantum mechanics that are not stable with respect to this particle limit, losing quantum coherence. We know for certain that there exist other situations where quantum coherence is kept to a large extent in many-body systems. This would mean that there exist situations where quantum fluctuations are not damped out with an increasing number of particles. But the very existence of the effect implied by the Lieb and Simon theorem means that quantum mechanics has an internal mechanism producing time asymmetry. This, together with environmental decoherence (e.g. the box containing a gas is classical, and so on), should grant a full understanding of the situation at hand.

Finally, we can say that Maccone's attempt, being along this line of thought, is a genuine way to understand the origin of time asymmetry from quantum mechanics. I hope his ideas will meet with luck.

Update: At Cosmic Variance you will find an interesting post and a discussion worth reading, involving Sean Carroll, Lorenzo Maccone and others, on the questions opened by Lorenzo's paper.

Quantum mechanics and gravity


Reading the arXiv daily today, I could not overlook a quite interesting paper that will appear soon in Physical Review Letters. This paper (see here), written by Saurya Das and Elias Vagenas, presents some relevant conclusions about the effects of gravity on quite common quantum mechanical systems. The authors base their conclusions on an acquired result, due mostly to string theory, that a fundamental length must exist, and that this fundamental length modifies the indeterminacy principle in a well-defined way. So, one can quantify the effect on any quantum mechanical system through a correcting Hamiltonian term and evaluate the effect of gravity on the system. In this way, one obtains an estimate of how relevant the effect is and how far off an experimental measurement of it may be. The conclusions the authors reach are quite interesting. Of course, all of the cases imply an effect too small to be within reach of laboratory observation, but the least trivial conclusion is that an intermediate fundamental length could exist that might be observed, e.g. at the LHC. This intermediate length would lie between the electroweak and the Planck scales.
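The modified indeterminacy principle the authors start from is usually written in the generalized-uncertainty-principle form (coefficients and conventions vary in the literature, and the precise ones used by Das and Vagenas may differ):

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left[\,1+\beta\,(\Delta p)^2\,\right],
\qquad \beta \sim \frac{1}{(M_{\rm Pl}\,c)^2},
```

which entails a minimal observable length \Delta x_{\min}\sim\hbar\sqrt{\beta}; the correcting Hamiltonian term then follows from deforming the momentum accordingly.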

It is the first time I have seen such estimates for quite simple quantum mechanical models, and I would expect more extended analyses along similar lines. Surely, it would be striking to see such a tiny correction to the Lamb shift in the laboratory. But, having worked in quantum optics, I learned that the progress experimentalists can deliver in a very short time can be very impressive. So, I would not be surprised if, in a few years, Physical Review Letters were to publish an experimental letter on this matter, reporting the first evidence of a quantum gravity effect in a laboratory.
