## Return to Paris

15/06/2013

Two years after the last edition, I was back in Paris to participate in the Twelfth Workshop on Non-perturbative Quantum Chromodynamics. The conference is organized by the high-energy group at Brown University and held at the Institut d’Astrophysique de Paris; Professor Chung-I Tan and Professor Berndt Mueller from Duke University are the organizers. As in the previous edition, the workshop was really interesting and rich in ideas for research. The first talk was given by Kostantinos Orginos and was about nuclear physics emerging from lattice computations. This is a subject I am involved in as an “end user”, so it is very close to my interests. It is noteworthy how current technology makes it possible to extract such results from lattice QCD, turning it into a useful tool for understanding low-energy phenomenology. I had a nice social dinner on Tuesday night with Kostantinos, his wife Vassiliki Panoussi and their sons, and an interesting discussion about the current state of lattice computations. The next speaker was Philippe de Forcrand, who is well known for his work on finite-temperature QCD on the lattice. He showed that the effective Yang-Mills theory at high temperature agrees surprisingly well with lattice results even when the temperature is lowered to a few times the critical temperature. Another interesting talk was the one by Peter Petreczky about the observables of QCD at finite temperature, which also presented the most recent value for the critical temperature. As my readers may know, I computed this value, properly corrected by the mass gap of Yang-Mills theory, in my recent paper in Physical Review C (see here). Norberto Scoccola and Daniel Gomez-Dumm showed similar results (see here).

Tuesday was devoted to the session on ultrarelativistic heavy-ion collisions. This was particularly interesting and included the talks of two friends of mine, Marco Ruggieri and Salvatore Plumari. In this area of research the situation is really interesting and hotly debated, and there is plenty of experimental data from RHIC and LHC. The session chair was Jean-Yves Ollitrault, who laid the foundations of the current understanding of the quark-gluon plasma through a hydrodynamic approximation. What is observed in the experiments is the production of a flow of particles in the transverse direction, named elliptic flow. This is clear evidence for the existence of the quark-gluon plasma. Marco and Salvatore work in the group of Vincenzo Greco at the University of Catania in Italy. The idea their work is based on is to derive the hydrodynamic equations from a kinetic description like the one provided by the Boltzmann equation. This approach opens up the possibility of deriving such an equation, and the full description of the quark-gluon plasma, directly from QCD by fixing the collisional integral of the kinetic equation. Of course, one should understand the applicability conditions, but my take is that, since the running coupling goes to zero due to asymptotic freedom, a quark-gluon plasma should have few multi-collision effects. On the other hand, this is a charged plasma, but it lives for a very short time. This means that this approach can prove really successful. One of the open questions is whether, going to higher energies, a state called a “color glass condensate” should form, and this is a matter of hot debate in the community. The resulting tension is reminiscent of the story I recounted about Landau gauge propagators for pure Yang-Mills theory (see here).
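For readers unfamiliar with the terminology, the elliptic flow mentioned above is the second coefficient in the standard Fourier decomposition of the azimuthal particle distribution,

```latex
\frac{dN}{d\phi} \propto 1 + 2\sum_{n\ge 1} v_n \cos\left[n\left(\phi - \Psi_n\right)\right],
```

with $v_2$ the elliptic flow and $\Psi_n$ the event-plane angles; a sizable $v_2$ signals collective, fluid-like behavior of the created medium.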
A color glass condensate would raise the lower bound on the viscosity-to-entropy ratio by a factor of 2 with respect to $1/4\pi$, the value also computed from string theory, and appears less efficient with respect to the elliptic flow observed at RHIC (see here). This kind of war is often unproductive in physics, and in science at large, as it slows down progress and good work may end up unpublished. In situations like this, researchers should keep their eyes and minds wide open, granting all contenders a fair hearing while waiting for experiments or careful lattice computations to have the last word. This is the lesson of the history of Landau gauge propagators, and of the history of physics more broadly. Otherwise we will remain in a silly, never-ending war in which all we prove to the rest of mankind is that nothing has been learned from the past.
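The string-theory value referred to above is the Kovtun-Son-Starinets bound on the ratio of shear viscosity to entropy density,

```latex
\frac{\eta}{s} \ \ge\ \frac{\hbar}{4\pi k_B},
```

often quoted simply as $1/4\pi$ in natural units; a quark-gluon plasma saturating it would be the most perfect fluid known.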

On Wednesday the session was dedicated to AdS/CFT, holography, and scattering. There was a talk by Carl Bender, who is currently working on PT-symmetric quantum mechanics. He is the pioneer of strong-coupling perturbation theory for quantum systems and quantum field theory, and I have often cited his work, which has been a source of inspiration. David Dudal also spoke, discussing a holographic model for the analysis of heavy-ion collisions and the effect of the huge magnetic field generated there. He gets results reminiscent of the Nambu-Jona-Lasinio model. David is one of the proponents of the refined Gribov-Zwanziger model (see here). This is a really successful approach to the understanding of Landau gauge propagators and fits quite well with my results on the deep infrared behavior of Yang-Mills theory, as I also pointed out in my talk (see below).

It was a great workshop and I was very happy to be there again this year. I hope the people at Brown University will organize it again. Thanks a lot!

Marco Frasca (2011). Chiral symmetry in the low-energy limit of QCD at finite temperature. Phys. Rev. C 84, 055208. arXiv: 1105.5274v4

D. Gomez Dumm, & N. N. Scoccola (2004). Characteristics of the chiral phase transition in nonlocal quark models. Phys. Rev. C 72, 014909 (2005). arXiv: hep-ph/0410262v2

Ollitrault, J. (1992). Anisotropy as a signature of transverse collective flow Physical Review D, 46 (1), 229-245 DOI: 10.1103/PhysRevD.46.229

M. Ruggieri, F. Scardina, S. Plumari, & V. Greco (2013). Elliptic Flow from Nonequilibrium Color Glass Condensate Initial Conditions. arXiv: 1303.3178v1

David Dudal, John Gracey, Silvio Paolo Sorella, Nele Vandersickel, & Henri Verschelde (2008). A refinement of the Gribov-Zwanziger approach in the Landau gauge: infrared propagators in harmony with the lattice results. Phys. Rev. D 78, 065047. arXiv: 0806.4348v2

Lieb, E., & Simon, B. (1973). Thomas-Fermi Theory Revisited Physical Review Letters, 31 (11), 681-683 DOI: 10.1103/PhysRevLett.31.681

Lieb, E., & Simon, B. (1977). The Thomas-Fermi theory of atoms, molecules and solids Advances in Mathematics, 23 (1), 22-116 DOI: 10.1016/0001-8708(77)90108-6

Marco Frasca (2006). Thermodynamic Limit and Decoherence: Rigorous Results Journal of Physics: Conference Series 67 (2007) 012026 arXiv: quant-ph/0611024v1

## Higgs and beyond

06/06/2013

I am writing these few lines while the conference “Higgs and beyond” is still going on at Tohoku University (Sendai) in Japan. The talks can be found here. Both ATLAS and CMS presented a lot of results about the Higgs particle, the most relevant being the combination of the data from the two experiments (see here). I am following the excellent account by Richard Ruiz on Twitter (@bravelittlemuon), who also contributes to CERN’s blog. One interesting point is that there seems to be a bump in the $Z\gamma$ channel that also persists in other channels. About the decay rates, the improved measurements again confirm nearly Standard-Model behavior of the Higgs particle, but with the WW and ZZ rates going down, still with too-large error bars (see my preceding post). The hope is that CMS and ATLAS will combine these data as well, reducing the error bars. No other Standard Model heavy Higgs particle is seen: both CMS and ATLAS are looking for evidence of more Higgs particles, to no avail yet. Of course, my view is that these excitations should be searched for with rates somewhat different from Standard Model expectations. In any case, the Standard Model confirms itself as one of the most successful theories in the history of physics. As one of the ATLAS speakers said: “There is overwhelming evidence for a new boson; there is overwhelming evidence for nothing else.” Both experiments plan to complete the analysis of the 8 TeV data for the summer conferences. My personal expectation is that just improving the precision of the measured decay rates could eventually give hints of new physics. To fulfill other hopes, we need the LHC upgrade, which will hopefully restart operations in the spring of 2015.

## Intrinsic decoherence observed again!

25/05/2013

Decoherence is the effect that causes a quantum system to behave classically. The best-known effect of this kind is due to the environment: the interaction of an open quantum system with its surroundings causes the loss of quantum coherence. This effect is experimentally well established and must be considered acquired knowledge. On the other side, it is a legitimate scientific question to ask whether a closed quantum system ever displays classical behavior for some reason. I have already put forward my take in this blog (see here). This week, a paper appeared in Physical Review Letters (see here and here) showing how intrinsic decoherence emerges in an experimental setup of two coupled kicked rotors. Kicked rotors are the epitome of studies on classical chaos and the corresponding quantum behavior. It is known that, classically, such a system displays momentum diffusion above a certain threshold, first computed by Boris Chirikov. The corresponding quantum system instead localizes when its classical counterpart is chaotic. This is the hallmark of proper quantum behavior, which refrains from the chaos typical of classical nonlinear systems; the main reason is that the Schrödinger equation is linear and the superposition principle applies. In 1988, S. Adachi, M. Toda, and K. Ikeda showed the really beautiful result that two such coupled systems lose quantum coherence (see here). The paper by Bryce Gadway, Jeremy Reeves, Ludwig Krinner, and Dominik Schneble (see here) is an experimental proof that the original theoretical result was a correct insight, and we have again a proof that environmental decoherence is not the whole story. An interesting account is given here. This paper is really striking and opens the door to a new class of experiments where closed quantum systems, possibly with many degrees of freedom involved, will be studied to give a full understanding of the quantum-classical transition.

Bryce Gadway, Jeremy Reeves, Ludwig Krinner, & Dominik Schneble (2012). Evidence for a Quantum-to-Classical Transition in a Pair of Coupled Quantum Rotors. Phys. Rev. Lett. 110, 190401 (2013). arXiv: 1203.3177v2

Adachi, S., Toda, M., & Ikeda, K. (1988). Quantum-Classical Correspondence in Many-Dimensional Quantum Chaos Physical Review Letters, 61 (6), 659-661 DOI: 10.1103/PhysRevLett.61.659

## CMS harbors new physics beyond the Standard Model

17/05/2013

These days LHCP 2013 (the First Large Hadron Collider Physics Conference) is ongoing, and CMS data seem to point significantly toward new physics. Their measurements of the WW and ZZ production modes agree with my recent computations (see here) and overall deviate slightly from Standard Model expectations, giving

$\frac{\sigma}{\sigma_{SM}}=0.80\pm 0.14$

Note that the Standard Model is still alive and kicking, but looking at the WW production mode you will read

$\frac{\sigma_{WW}}{\sigma_{WW\ SM}}=0.68\pm 0.20$

in close agreement with the results given in my paper and improved with respect to the Moriond value of $0.71\pm 0.21$. The reason could be that the Higgs model is a conformal one. Data from ZZ yield

$\frac{\sigma_{ZZ}}{\sigma_{ZZ\ SM}}=0.92\pm 0.28$

which is nevertheless consistent with the result for the WW mode. I give here the full table from the talk

For the sake of completeness, I also give the corresponding results presented by ATLAS at the same conference; they seem to go the other way round, obtaining $1.30\pm 0.20$ overall, and this is already an interesting matter in itself.

At CMS, new physics beyond the Standard Model is peeping out and, more interestingly, the Higgs model tends to be a conformal one. If this is true, supersymmetry is an inescapable consequence (see here). I would like to conclude by citing the papers of other people working on this model, which will be widely cited in the foreseeable future (see here and here).

Marco Frasca (2013). Revisiting the Higgs sector of the Standard Model arXiv arXiv: 1303.3158v1

Marco Frasca (2010). Mass generation and supersymmetry arXiv arXiv: 1007.5275v2

T. G. Steele, & Zhi-Wei Wang (2013). Is Radiative Electroweak Symmetry Breaking Consistent with a 125 GeV Higgs Mass? Physical Review Letters 110, 151601. arXiv: 1209.5416v3

Krzysztof A. Meissner, & Hermann Nicolai (2006). Conformal Symmetry and the Standard Model Phys.Lett.B648:312-317,2007 arXiv: hep-th/0612165v4

## Tartaglia-Pascal triangle and quantum mechanics

26/04/2013

The paper I wrote with Alfonso Farina and Matteo Sedehi about the link between the Tartaglia-Pascal triangle and quantum mechanics is now online (see here). This paper contains, as a statement, my theorem providing a connection between the square root of a Wiener process and the Schrödinger equation, which aroused a lot of interest and much criticism from some mathematicians (see here). So, it is worthwhile to tell how all this came about.

In the fall of 2011, Alfonso Farina called me because he had an open problem. He and his colleagues had published a paper in Signal, Image and Video Processing, a Springer journal, showing how the Tartaglia-Pascal triangle is deeply connected with diffusion and the Fourier equation. The connection comes from the binomial coefficients, the elements of the Tartaglia-Pascal triangle, which in a certain limit give a Gaussian; and this Gaussian, in the continuum, is the solution of the Fourier equation of heat diffusion. This entails a deep connection with stochastic processes. For most people working in the area of radar and sensors, stochastic processes are essential to understand how these devices measure through filtering theory. But, in the historical perspective in which Farina et al. placed their paper, they were not able to get a proper connection for the Schrödinger equation, even though they recognized a deep formal analogy with the Fourier equation. This was the open question: how to connect the Tartaglia-Pascal triangle and the Schrödinger equation?

People working in quantum physics are aware of the difficulties researchers have met in linking stochastic processes à la Wiener with quantum mechanics. Indeed, skepticism is the prevailing feeling among all of us about this matter. So, the question Alfonso put to me was not that easy. But the paper by Alfonso et al. also contains a possible answer: just start from the discrete and then go back to the continuum. The analog of the heat equation is the Schrödinger equation for a free particle and its kernel, and indeed the evolution of a Gaussian wave-packet can be worked out on the discrete and gives back the binomial coefficients. What you get in this way are the square roots of the binomial coefficients. So, the link with the Tartaglia-Pascal triangle is rather subtle in quantum mechanics and enters through a square root, reminiscent of Dirac's work and his greatest achievement, the Dirac equation. This answered Alfonso's question, and in a way that was somewhat unexpected.

Then, I thought that this connection could be deeper than what we had found. I tried to modify Itō calculus to accommodate fractional powers of a Wiener process. I posted my paper on arXiv and performed both experimental and numerical computations. All this confirms my theorem that the square root of a Wiener process has the Schrödinger equation as its diffusion equation. You can easily take the square root of natural noise (I did it) or compute it with your preferred math software. It is just interesting that mathematicians have so far declined to cope with this and still claim that all this evidence does not exist, basing their claims on a theory that can be easily amended.

We have just sown a seed in the earth. This is our main work, and we feel sure that very good fruits will come of it. Thank you very much, Alfonso and Matteo!

Farina, A., Frasca, M., & Sedehi, M. (2013). Solving Schrödinger equation via Tartaglia/Pascal triangle: a possible link between stochastic processing and quantum mechanics Signal, Image and Video Processing DOI: 10.1007/s11760-013-0473-y

Marco Frasca (2012). Quantum mechanics is the square root of a stochastic process arXiv arXiv: 1201.5091v2

Farina, A., Giompapa, S., Graziano, A., Liburdi, A., Ravanelli, M., & Zirilli, F. (2011). Tartaglia-Pascal’s triangle: a historical perspective with applications Signal, Image and Video Processing, 7 (1), 173-188 DOI: 10.1007/s11760-011-0228-6

## Conformal Standard Model is consistent with the observed Higgs particle

12/04/2013

Robert Garisto is an editor of Physical Review Letters, the flagship journal of the American Physical Society and the one with the highest impact factor in physics. I follow him on Twitter (@RobertGaristo), where he points out interesting papers appearing in the journal he works for. This time I read the following

and immediately turned my attention to the linked paper: this one (if you do not have a subscription, you can find it on arXiv) by Tom Steele and Zhi-Wei Wang. Using the technique of Padé approximants and an averaging method, they show how to compute the mass of the Higgs particle from the Coleman-Weinberg mechanism, estimating contributions up to ninth order. This is needed because they require a stronger coupling than in the original Higgs mechanism. They obtain an upper bound of 141 GeV for the mass and 0.352 for the self-coupling, while they get a mass of 124 GeV for a self-coupling of 0.23. This shows unequivocally that the quadratic term, the one generating the hierarchy problem, is not needed at all, and the Standard Model, in its conformal formulation, is able to predict the mass of the Higgs particle. Besides, the production rates are identical to the original model except for the production of Higgs pairs, and this is where one could tell which way nature has chosen. This means that, at the moment, one has no way to be sure this is the right solution; we have to wait until 2015, after the LHC upgrade. So, once again, precise measurements of these decay rates are essential to tell whether we are coping with the original Higgs mechanism or something different, or whether we need two more years to answer this question. In any case, it is possible that the Nobel committee will have to wait before taking a decision. However, in the sixties that formulation was the only possible one, and any other solution would have been impossible to discover given the knowledge of the time. Its authors did a great job, even if a different mechanism will eventually be proven at work, as they gave credibility to the Standard Model so that people could trust it.

Finally, I would like to note that the value of the coupling is consistent with my recent estimate, where I get 0.36 for the self-interaction. I get different production rates, and I would be curious to see how the pictures from ATLAS and CMS would change if compared against something other than the Standard Model, in order to claim that no other Higgs-like particle is seen.

What we can conclude is that the conformal Standard Model is in even better shape than before, and just a single Higgs particle would be needed. An astonishing result.

Steele, T., & Wang, Z. (2013). Is Radiative Electroweak Symmetry Breaking Consistent with a 125 GeV Higgs Mass? Physical Review Letters, 110 (15) DOI: 10.1103/PhysRevLett.110.151601

Marco Frasca (2013). Revisiting the Higgs sector of the Standard Model arXiv arXiv: 1303.3158v1

## Much closer to the Standard Model

18/03/2013

Today, the arXiv daily listing includes a contribution from John Ellis and Tevong You analyzing the new data on the Higgs particle presented by CMS and ATLAS at Aspen and Moriond over the last two weeks (see here). Their result can be summarized in the following figure

which is really impressive. This means that the updated data coming out of the LHC constrain even more tightly the Higgs particle found so far to be the Standard Model one. Another impressive conclusion they are able to draw is that the couplings appear to be proportional to the masses, as should be expected for a well-behaved Higgs particle. But they emphasize that this is “a” Higgs particle and that the scenario is well consistent with supersymmetry. Citing them:

The data now impose severe constraints on composite alternatives to the elementary Higgs boson of the Standard Model. However, they do not yet challenge the predictions of supersymmetric models, which typically make predictions much closer to the Standard Model values. We therefore infer that the Higgs coupling measurements, as well as its mass, provide circumstantial support to supersymmetry as opposed to these minimal composite alternatives, though this inference is not conclusive.

They say that further progress in the understanding of this particle will come once the upgraded LHC runs and, indeed, nobody expects dramatic changes to this scenario from the data at hand.

John Ellis, & Tevong You (2013). Updated Global Analysis of Higgs Couplings arXiv arXiv: 1303.3879v1

## A Higgs particle but which one?

14/03/2013

After the Moriond conference last week, and while the Moriond QCD and Aspen conferences are still running, an important conclusion can be drawn, and it is the one given in this CERN press release. The particle announced on the 4th of July last year is for certain a Higgs particle, as it has spin 0 and positive parity and couples to all other particles almost like the Standard Model Higgs. The agreement with the Standard Model is increasing embarrassingly as the data accumulated since last year are analyzed. Today, CMS will also update their results for the decay $H\rightarrow\gamma\gamma$, and we will know if the small deviation observed by ATLAS is confirmed. It is true that they see such a deviation with a larger dataset but, rather than increasing, it has slightly diminished, and this is not really encouraging.

So far, no other particle has been seen and no new physics beyond the Standard Model is on the horizon. Some people are pushing for a conclusive assignment of the nature of this boson to the vanilla Higgs particle postulated in the sixties. But it is really too early to draw such a conclusion, and I have explained why in a paper of mine that appeared today on arXiv (see here). Indeed, a formulation of the Higgs field is possible that, at tree level, coincides with the original Higgs field (a Higgs impostor). This is due to the existence of exact solutions of the equations of motion of such a field (see here). The relevant quantity to tell which one is realized in nature is the decay rate into WW and ZZ and, with the current data, there is agreement for both so far. But, since the amplitudes are exponentially damped, higher excited states of the Higgs boson cannot easily be seen at present, and any sign of them still appears as a statistical fluctuation. This can be evaluated quantitatively. It is important because the ZZ decay is sensitive to higher masses and displays some peaks that currently reveal themselves as statistical fluctuations. Increasing the number of events could turn these peaks into real observations.

The interesting point here is that we are moving from the discovery moment to the study phase, with a lot of room for improving measurements on this Higgs particle. But the analysis of the existence of higher excited states, the Higgs' brothers, is just in its infancy.

Update: This is the analogous figure from ATLAS, while the figure for $H\rightarrow\gamma\gamma$ from CMS agrees quite well with the Standard Model: $0.8\pm 0.3$.

Marco Frasca (2013). Revisiting the Higgs sector of the Standard Model arXiv arXiv: 1303.3158v1

Marco Frasca (2009). Exact solutions of classical scalar field equations J.Nonlin.Math.Phys.18:291-297,2011 arXiv: 0907.4053v2

## Lucasian chair again

13/02/2013

On 22 May, Professor Michael Green, the incumbent Lucasian Professor at Cambridge University, will be 67 and must retire. He succeeded Stephen Hawking, who left the chair for the same reason in 2009. Well before Hawking's retirement, Cambridge University issued an announcement asking for possible candidates and, after the selection ended, Professor Green came out as the chosen one. This time, no announcement has come from Cambridge, so it is possible that the successor of Professor Green is already known. I think the news will be released in the next few months. So far, no rumors have spread.

## Back to CUDA

11/02/2013

It is about two years since I wrote my last post about the CUDA technology by NVIDIA (see here). At that time I had added two new graphics cards to my PC, being on the verge of reaching 3 Tflops in single precision for lattice computations. Then I had an unlucky turn of events: these cards went back to the seller as they were not working properly, and I was completely refunded. Meanwhile, the motherboard also failed and the hardware was largely replaced, so for a long time I was without the opportunity to work with CUDA and perform the intensive computations I had planned. As is well known, one can find a lot of software exploiting this excellent technology provided by NVIDIA and, over these years, it has spread widely, both in academia and in industry, making the life of researchers a lot easier. Personally, I am using it at my workplace too, and it is really exciting to have such computational capability at hand at a really affordable price.

Now, I am once again able to equip my home computer with a powerful Tesla card. Some of these cards are currently being decommissioned at the end of their service life, due to upgrades to more modern ones, and so can be found at a really low price on bidding sites like eBay. So, I bought a Tesla M1060 for about 200 euros. As the name says, this card was not conceived for a personal computer but rather for servers produced by some OEMs. This is also apparent when one looks at the card and sees a passive cooler: the card is sized to fit inside a server, while the active dissipation through fans is expected to be provided by the server itself. Indeed, I added an 80 mm Enermax fan to my chassis (also an Enermax Enlobal) to make sure the motherboard temperature does not get too high. My motherboard is an ASUS P8P67 Deluxe. This is a very good board, as usual for ASUS, providing three PCIe 2.0 slots, so in principle one can add up to three video cards. But if you have a couple of NVIDIA cards in SLI configuration, the slots work at x8; a single video card will work at x16. Of course, if you plan to work with these configurations, you will need a proper PSU. I have a Cooler Master Silent Pro Gold 1000 W and I am well beyond my needs. This is what remains of my previous configuration and it is performing really well. I have also changed my CPU, which is now an Intel i3-2125 with two cores at 3.30 GHz and 3 MB cache. Finally, I added 16 GB of Corsair Vengeance DDR3 RAM.

The installation of the card went really smoothly and I got it up and running in a few minutes on Windows 8 Pro 64-bit, after installing the proper drivers. I checked with Matlab 2011b and the PGI compilers with CUDA Toolkit 5.0 properly installed. All worked fine. I would like to spend a few words on the PGI compilers, which are made by The Portland Group. I got a trial license at home and tested them, while at my workplace we have a fully working license. These compilers make writing accelerated CUDA code absolutely easy: all you need is to insert some compiler directives into your C or Fortran code. I ran some performance tests and the gain is really impressive, without ever writing a single line of CUDA code. These compilers can also be used with Matlab to build mex-files or S-functions, even though they are not yet supported by MathWorks (they should be!), and I have verified this without too much difficulty, both for C and Fortran.

Finally, I would like to give you an idea of the way I will use CUDA technology for my aims. What I am doing right now is porting some good code for the scalar field, which I would like to use in the limit of large self-interaction to derive the spectrum of the theory. It is well known that if you take the limit of the self-interaction going to infinity you recover the Ising model. But I would like to see what happens at intermediate but large values, as I was not able to get any hint on this from the literature, even though this is the workhorse for anyone doing lattice computations. What seems to matter today is to show triviality in four dimensions, a well-established piece of evidence. As soon as the accelerated code runs properly, I plan to share it here, since it is very easy to find good code for lattice QCD but very difficult to find good code for scalar field theory. Stay tuned!