## Standard Model at the horizon

08/12/2014

Hawking radiation is one of the most famous effects where quantum field theory combines successfully with general relativity. Since Stephen Hawking uncovered it in 1975, this result has attracted enormous attention and has been derived in many different ways. The idea is that, very near the horizon of a black hole, a pair of particles can be produced, one of which falls into the hole while the other escapes to infinity and is seen as emitted radiation. The overall effect is to drain energy from the hole, as the pair is formed at its expense, and its ultimate fate is to evaporate. The distribution of this radiation is practically thermal, and a temperature and an entropy can be attached to the black hole. The entropy is proportional to the area of the black hole computed at the horizon, as also postulated by Jacob Bekenstein, and so it can only increase. Thermodynamics applies to black holes as well. Since then, the quest to understand the microscopic origin of this entropy has produced a huge literature, with notable insights coming from string theory and loop quantum gravity.
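To fix the orders of magnitude, here is a minimal numerical sketch of the standard Hawking temperature and Bekenstein-Hawking entropy formulas for a Schwarzschild black hole; the solar-mass number is just an illustration.

```python
# Hawking temperature T = hbar c^3 / (8 pi G M k_B) and
# Bekenstein-Hawking entropy S = k_B A c^3 / (4 G hbar) for a
# Schwarzschild black hole of mass M (SI units throughout).
from scipy.constants import hbar, c, G, k as k_B, pi

M_sun = 1.989e30  # kg

def hawking_temperature(M):
    """Temperature in kelvin of a Schwarzschild hole of mass M."""
    return hbar * c**3 / (8 * pi * G * M * k_B)

def bekenstein_hawking_entropy(M):
    """Entropy in J/K: proportional to the horizon area A = 4 pi r_s^2."""
    r_s = 2 * G * M / c**2          # Schwarzschild radius
    A = 4 * pi * r_s**2             # horizon area
    return k_B * A * c**3 / (4 * G * hbar)

print(hawking_temperature(M_sun))          # ~6.2e-8 K, far colder than the CMB
print(bekenstein_hawking_entropy(M_sun))   # ~1.5e54 J/K
```

The tiny temperature for astrophysical masses is why the evaporation of such holes is utterly negligible today; only small holes radiate appreciably.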

In all the derivations of this effect people generally assume that the particles are free, and there are very good reasons to do so. In this way the theory is easier to manage and quantum field theory on curved spaces yields definite results: the wave equation is separable and exactly solvable (see here and here). For a scalar field, a self-interaction term puts you immediately in trouble. Notwithstanding this, in the ’80s Unruh and Leahy, considering the simplified case of two dimensions and a Schwarzschild geometry, uncovered a peculiar effect: at the horizon of the black hole the interaction appears to be switched off (see here). This means that the original derivation by Hawking for free particles has indeed a general meaning but, worse, all particles become non-interacting and massless at the horizon when one considers the Standard Model! Cooper will have a very bad time crossing Gargantua’s horizon.

Turning back from science fiction to reality, this problem stood forgotten for all this time and nobody studied it much further. The reason is that the vacuum in a curved space-time is not trivial, as first noted by Hawking, and even less so when particles interact. Simply put, people have increasing difficulties managing a theory that is already complicated in its simplest form. Algebraic quantum field theory provides a rigorous approach to this (e.g. see here). These authors consider an interacting theory with a $\varphi^3$ term but do perturbation theory (small self-interaction), probably hiding the Unruh-Leahy effect in this way.

The situation changes radically if one has exact solutions. A classical $\varphi^4$ theory can indeed be solved exactly and made manageable (see here). A full quantum field theory can then be developed in the strong self-interaction limit (see here), and so the Unruh-Leahy effect can be accounted for. I did so, and I obtained the same conclusion for the Kerr black hole (the one of Interstellar) in four dimensions (see here). This can have devastating implications for the Standard Model of particle physics. The reason is that, if the Higgs field is switched off at the horizon, all the particles will lose their masses and electroweak symmetry will be recovered. Further analysis will also be necessary for Yang-Mills fields, and I suspect the same conclusion has to hold in that case too. So, the Unruh-Leahy effect seems to be on the same footing, and of the same importance, as Hawking radiation. A deep understanding of it would be needed, starting from quantum gravity. It is a kind of holy grail: the switch-off of all couplings.
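The exact solvability is easy to see in the simplest setting. Here is a quick numerical check, in 0+1 dimensions, that a Jacobi elliptic function solves the classical quartic equation $\ddot\varphi+\lambda\varphi^3=0$ exactly; the solutions in the cited papers are the 3+1 dimensional analogue of this.

```python
# Check that phi(t) = A * cn(A*sqrt(lam)*t, k) with k^2 = 1/2 solves
# phi'' + lam * phi^3 = 0: the residual should be at the level of the
# finite-difference error only.
import numpy as np
from scipy.special import ellipj

lam, A = 2.0, 1.3                      # arbitrary coupling and amplitude
t = np.linspace(0.0, 5.0, 20001)
sn, cn, dn, ph = ellipj(A * np.sqrt(lam) * t, 0.5)   # scipy uses m = k^2
phi = A * cn

# second derivative by central differences on the interior points
h = t[1] - t[0]
phi_tt = (phi[2:] - 2 * phi[1:-1] + phi[:-2]) / h**2
residual = phi_tt + lam * phi[1:-1] ** 3
print(np.max(np.abs(residual)))        # tiny: the equation holds exactly
```

The amplitude fixes the frequency, a hallmark of the nonlinearity: the oscillation is fully anharmonic, not a perturbation of a free wave.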

Further analysis is needed to get a confirmation of it. But now, I am somewhat more scared to cross a horizon.

V. B. Bezerra, H. S. Vieira, & André A. Costa (2013). The Klein-Gordon equation in the spacetime of a charged and rotating black hole Class. Quantum Grav. 31 (2014) 045003 arXiv: 1312.4823v1

H. S. Vieira, V. B. Bezerra, & C. R. Muniz (2014). Exact solutions of the Klein-Gordon equation in the Kerr-Newman background and Hawking radiation Annals of Physics 350 (2014) 14-28 arXiv: 1401.5397v4

Leahy, D., & Unruh, W. (1983). Effects of a λΦ⁴ interaction on black-hole evaporation in two dimensions Physical Review D, 28 (4), 694-702 DOI: 10.1103/PhysRevD.28.694

Giovanni Collini, Valter Moretti, & Nicola Pinamonti (2013). Tunnelling black-hole radiation with $φ^3$ self-interaction: one-loop computation for Rindler Killing horizons Lett. Math. Phys. 104 (2014) 217-232 arXiv: 1302.5253v4

Marco Frasca (2009). Exact solutions of classical scalar field equations J.Nonlin.Math.Phys.18:291-297,2011 arXiv: 0907.4053v2

Marco Frasca (2013). Scalar field theory in the strong self-interaction limit Eur. Phys. J. C (2014) 74:2929 arXiv: 1306.6530v5

Marco Frasca (2014). Hawking radiation and interacting fields arXiv arXiv: 1412.1955v1

## That’s a Higgs but how many?

17/11/2014

The CMS and ATLAS collaborations are still hard at work producing results from the datasets obtained in the first phase of activity of the LHC. The restart is really around the corner and, maybe already next summer, things could change considerably. Anyway, what they extract from the old data can be really promising and rather intriguing. This is the case for the recent paper by CMS (see here). The aim of this work is to see whether a heavier Higgs state exists, and the decay they study is $Zh\rightarrow l^+l^-b\bar{b}$. That is, one has a signature with two leptons moving in opposite directions, arising from the decay of the $Z$, and two bottom quarks arising from the decay of the Higgs particle. The analysis of this decay aims to get hints of the existence of a heavier pseudoscalar Higgs state. This would be greatly important for SUSY extensions of the Standard Model that foresee more than one Higgs particle.

Often CMS presents its results with some intriguing open questions; this is one of those cases and so it is worth this blog entry. Here is the main result

The evidence, as said in the paper, is a 2.6-2.9 sigma excess at 560 GeV and a smaller one at around 300 GeV. The look-elsewhere effect reduces the former to 1.1 sigma, and the latter becomes practically negligible. Overall this is far from significant but, as always, with more data after the restart it could become something real or just fade away. The fact that a door is left open anyway, and a possible effect is pointed out, should be appreciated.
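To put the quoted numbers in perspective, here is the standard one-sided conversion between significance and p-value; the look-elsewhere correction itself depends on the search range and is done by the experiment, so this is only meant to show what a drop from 2.6 to 1.1 sigma means.

```python
# One-sided significance <-> p-value conversion for a Gaussian test statistic.
from scipy.stats import norm

def p_value(sigma):
    """One-sided tail probability for a given number of sigmas."""
    return norm.sf(sigma)

def significance(p):
    """Inverse: number of sigmas for a one-sided p-value."""
    return norm.isf(p)

print(p_value(2.6))   # local significance at 560 GeV: about 5 in a thousand
print(p_value(1.1))   # after look-elsewhere: about 1 in 7, hardly an excess
```

A 2.6 sigma local excess happens by chance in one search out of a couple of hundred; once the number of places it could have appeared is accounted for, 1.1 sigma is entirely unremarkable.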

My personal interpretation is that such higher excitations do exist, but their production rates are heavily suppressed with respect to the observed ground state at 126 GeV and so are negligible with the present datasets. I am also convinced that the current understanding of the breaking of SUSY, as adopted in MSSM-like extensions of the Standard Model, is not the correct one, and this is provoking the early death of such models. I have explained this in a couple of papers of mine (see here and here). It is my firm conviction that the restart will yield exciting results, and we should be really happy to have such a powerful machine in our hands to grasp them.

Marco Frasca (2013). Scalar field theory in the strong self-interaction limit Eur. Phys. J. C (2014) 74:2929 arXiv: 1306.6530v5

Marco Frasca (2012). Classical solutions of a massless Wess-Zumino model J.Nonlin.Math.Phys. 20:4, 464-468 (2013) arXiv: 1212.1822v2

## DICE 2014

21/09/2014

I have spent this week in Castiglioncello participating in the DICE 2014 Conference. This Conference is held every two years, with the main organizational effort due to Thomas Elze.

Castello Pasquini at Castiglioncello  (DICE 2014)

I participated in the 2006 edition, where I gave a talk about decoherence and the thermodynamic limit (see here and here). This is one of the main conferences where foundational questions can be discussed with the intervention of some major physicists. This year there were 5 keynote lectures from famous researchers. The opening lecture was held by Tom Kibble, one of the founding fathers of the Higgs mechanism. I met him at the registration desk and had the luck of a handshake and a few words with him. His talk was a recollection of the epic of the Standard Model. The second notable lecturer was Mario Rasetti. Rasetti is working on the question of big data, that is, the huge amount of information currently exchanged on the web, which is difficult to manage and not only for a matter of quantity. What Rasetti and his group showed is that topological field theory yields striking results when applied to such a case. An application to NMRI of the brain exemplified this in a blatant manner.

On the third day there were the lectures by Avshalom Elitzur and Alain Connes, the Fields medallist. Elitzur is widely known for the concept of weak measurement, a key idea of quantum optics. Connes presented his recent introduction of the quanta of geometry, which should make loop quantum gravity researchers happy. You can find the main concepts here. Connes explained how the question of the mass of the Higgs got fixed and said that, since his proposal for the geometry of the Standard Model, he has been able to overcome all the setbacks that appeared along the way; this was just another one. From my side, his approach appears really interesting, as the Brownian motion I introduced in quantum mechanics could be understood through the quanta of volume that Connes and collaborators uncovered.

Gerard ‘t Hooft talked on Thursday. The question he discussed was cellular automata and quantum mechanics (see here). For several years ‘t Hooft has been looking for a classical substrate to quantum mechanics, and this was also the point of other speakers at the Conference. Indeed, he has had some clashes with people working on quantum computation, as ‘t Hooft, following his views, is somewhat sceptical about it. I intervened on this question based on the theorem of Lieb and Simon, generally overlooked in such discussions, defending ‘t Hooft’s ideas and so generating some fuss (see here and the discussion I had with Peter Shor and Aram Harrow). Indeed, we finally stipulated that some configurations can evade the Lieb and Simon theorem, granting a quantum behaviour at the macroscopic level.

This is my talk at DICE 2014, given the same day as ‘t Hooft’s (he was there listening). I was able to prove the existence of fractional powers of Brownian motion and presented new results on the derivation of the Dirac equation from a stochastic process.

The Conference was excellent and I really enjoyed it. I have to thank the organizers for the beautiful atmosphere and the really pleasant stay, with a full immersion in wonderful science. All the speakers gave stimulating and enjoyable talks. For my part, I will keep on working on foundational questions and look forward to the next edition.

Marco Frasca (2006). Thermodynamic Limit and Decoherence: Rigorous Results Journal of Physics: Conference Series 67 (2007) 012026 arXiv: quant-ph/0611024v1

Ali H. Chamseddine, Alain Connes, & Viatcheslav Mukhanov (2014). Quanta of Geometry arXiv arXiv: 1409.2471v3

Gerard ‘t Hooft (2014). The Cellular Automaton Interpretation of Quantum Mechanics. A View on the Quantum Nature of our Universe, Compulsory or Impossible? arXiv arXiv: 1405.1548v2

## The question of the mass gap

10/09/2014

Some years ago I proposed a set of solutions to the classical Yang-Mills equations displaying a massive behaviour. For a massless theory this is somewhat unexpected. After a criticism by Terry Tao I had to admit that, for a generic gauge, such solutions are just asymptotic ones, holding when the coupling runs to infinity (see here and here). Although my arguments on Yang-Mills theory were not changed by this, I found such a conclusion somewhat unsatisfactory. The reason is that if you have classical solutions to the Yang-Mills equations that display a mass gap, their quantization cannot change such a conclusion; rather, one should eventually expect a superimposed quantum spectrum. But working with asymptotic classical solutions can make things somewhat involved, and this forced me to always choose the Lorenz gauge, because in that case the solutions were exact. Besides, it is a great success for a physicist to find exact solutions to fundamental equations of physics, as these yield an immediate idea of what is going on in a theory. In such a case we would also get a conclusive representation of the way the mass gap can form.

Finally, after some years of struggle, I was able to get such a set of exact solutions to classical Yang-Mills theory displaying a mass gap (see here). Such solutions confirm both Tao’s argument, that an all-equal-component solution of the Yang-Mills equations cannot hold in any gauge, and my original argument, that an all-equal-component solution holds, in the general case, only asymptotically with the coupling running to infinity. But classically there exist solutions displaying a mass gap that arises from the nonlinearity of the equations of motion, and the mass gap goes to zero as the coupling does. Translating this into the quantum realm is straightforward, as I showed for the Lorenz (Landau) gauge. I hope all this will help to better elucidate the physics of strong interactions. My efforts since 2005 have gone in that direction and are still going on.
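For orientation, the structure of these solutions, as I read the cited papers (the precise normalizations and the general-gauge form are spelled out there), is a mapping of the gauge field onto an exact solution of the quartic scalar equation:

```latex
% Sketch of the Lorenz-gauge mapping: the gauge field is proportional, through
% constant coefficients, to an exact elliptic solution of the scalar equation.
A_\mu^a(x) = \eta_\mu^a\,\phi(x), \qquad
\phi(x) = \mu\left(\frac{2}{Ng^2}\right)^{1/4}
          \operatorname{sn}\!\left(p\cdot x+\theta,\ i\right), \qquad
p^2 = \mu^2\sqrt{\frac{Ng^2}{2}},
```

with $\operatorname{sn}$ a Jacobi elliptic function, $\mu$ and $\theta$ integration constants, and $\eta_\mu^a$ a set of constants. The dispersion relation $p^2\propto (Ng^2)^{1/2}\mu^2$ is the point: the nonlinearity endows the otherwise massless field with a mass proportional to $(Ng^2)^{1/4}\mu$, which indeed vanishes as the coupling does.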

Marco Frasca (2009). Mapping a Massless Scalar Field Theory on a Yang-Mills Theory: Classical Case Mod. Phys. Lett. A 24, 2425-2432 (2009) arXiv: 0903.2357v4

Marco Frasca (2014). Exact solutions for classical Yang-Mills fields arXiv arXiv: 1409.2351v1

## Higgs what?

06/09/2014

In these days the new edition of the Review of Particle Physics by the Particle Data Group (PDG) has been announced. This is the bread and butter of any particle physicist and contains all the relevant data in this area of research. It is quite common for us to search the on-line version, or use the booklet, to look up a mass or a decay rate. After the first run of LHC data gathering about the Higgs particle, this edition contains a bunch of fundamental information about it, and I post a part of it here.

It is the Standard Model Higgs! No, not so fast. Take a look at the WW final state. It is somewhat low but yes, it is perfectly consistent with the Standard Model. Also, the error bars are somewhat too large to conclude anything definitive. So, let us take a closer look at these strengths.

We discover that the strengths measured by CMS are really low and pull this value down. Indeed, this is consistent with my proposal here: I get 0.68 for both the WW and ZZ channels. On the other side, ATLAS moves everything upward consistently, and there is this strange behaviour of the two experiments compensating each other. So, let us also take a look at the ZZ strength. PDG yields

Again CMS agrees with my conclusions and ATLAS moves upward to compensate. But both these results, due to the large error bars, agree rather well with the Standard Model. So, I looked through the CMS publications produced so far to see whether either of these analyses had been improved. The result was that CMS has improved the measurement of the strength in the WW channel to leptons (see here). What they measure is

$\frac{\sigma}{\sigma_{SM}}=0.72^{+0.20}_{-0.18}.$

The error is significantly smaller and the result striking: it is bending the “wrong” way, losing higgsness. It would be interesting to understand why CMS appears to get these strengths downward while ATLAS moves upward, the two compensating each other toward the Standard Model. On the other side, I have to admire the more aggressive approach by CMS, with their results more and more similar to my expectations. I am just curious to see, with the restart of the LHC, what will happen to these data that CMS has sharpened to such a point.
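A crude way to quantify the tension is a pull, using the asymmetric errors on the side each hypothesis lies on; this is only back-of-the-envelope arithmetic, not a proper likelihood comparison.

```python
# Measured WW signal strength mu = 0.72 +0.20 -0.18: distance in sigmas from
# the Standard Model value mu = 1 and from the 0.68 quoted above.
mu, err_up, err_dn = 0.72, 0.20, 0.18

pull_SM   = (1.00 - mu) / err_up   # the SM value lies above the measurement
pull_mine = (mu - 0.68) / err_dn   # 0.68 lies below it

print(round(pull_SM, 2))    # ~1.4 sigma away from mu = 1
print(round(pull_mine, 2))  # ~0.22 sigma away from mu = 0.68
```

Neither value is excluded, of course, but the measurement sits much closer to 0.68 than to 1 in units of its own error.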

Marco Frasca (2013). Scalar field theory in the strong self-interaction limit Eur. Phys. J. C (2014) 74:2929 arXiv: 1306.6530v5

CMS Collaboration (2013). Measurement of Higgs boson production and properties in the WW decay channel with leptonic final states JHEP 01 (2014) 096 arXiv: 1312.1129v2

## Do quarks grant confinement?

21/07/2014

In 2010 I went to Ghent in Belgium for a very nice Conference on QCD. My contribution was accepted and I had the chance to describe my view on this matter. The result was this contribution to the proceedings. The content of this paper was really revolutionary at that time, as my view of Yang-Mills theory, the mass gap and the role of quarks was almost completely out of step with the rest of the community. So, I am deeply grateful to the Organizers for this opportunity. The main ideas I put forward were

• Yang-Mills theory has an infrared trivial fixed point. The theory is trivial exactly as the scalar field theory is.
• Due to this, the gluon propagator is well represented by a sum of weighted Yukawa propagators.
• The theory acquires a mass gap that is just the ground state of a tower of states with the spectrum of a harmonic oscillator.
• The reason why Yang-Mills theory is trivial in the infrared limit while QCD is not is the presence of quarks. Their existence moves the theory from triviality to asymptotic safety.
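The second and third points can be sketched numerically. Note that the weights below are illustrative placeholders decaying with the excitation number, not the published coefficients; only the structure (a Yukawa sum over a tower $m_n=(2n+1)\,m_0$) is the point.

```python
# Toy gluon propagator as a weighted sum of Yukawa terms over a tower of
# masses m_n = (2n+1)*m0 (Euclidean momenta). The weights B_n are placeholder
# values, NOT the ones derived in the papers cited in the text.
import numpy as np

m0 = 0.5  # mass gap in GeV, order of magnitude only

def gluon_propagator(p2, n_terms=10):
    n = np.arange(n_terms)
    m_n = (2 * n + 1) * m0          # harmonic-oscillator-like tower
    B_n = np.exp(-n)                # illustrative weights...
    B_n = B_n / B_n.sum()           # ...normalized to sum to 1
    return np.sum(B_n / (p2[:, None] + m_n**2), axis=1)

p2 = np.linspace(0.0, 4.0, 100)    # Euclidean p^2 in GeV^2
G = gluon_propagator(p2)
print(G[0])   # finite at p^2 = 0: a massive-like (decoupling) behaviour
```

The finite value at zero momentum is the signature of the decoupling solution discussed below: no pole at $p^2=0$ and no divergence, in contrast to a scaling solution.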

These results, which I have published in respectable journals, became the reason for the rejection of most of my subsequent papers by several referees, notwithstanding that there were no serious reasons motivating it. But this is routine in our activity. What annoyed me a lot, indeed, was a referee’s report claiming that my work was incorrect because the last of my statements was incorrect: quark existence, it said, is not a correct motivation to claim asymptotic safety, and so confinement, for QCD. Another offending point was the strong support my approach was giving to the idea of a decoupling solution, as was emerging from lattice computations on extended volumes. There was a widespread idea that the gluon propagator should go to zero in a pure Yang-Mills theory to grant confinement and that, if not so, an infrared non-trivial fixed point must exist.

Recently, my last point has been vindicated by a group that has been instrumental in shaping the history of this corner of research in physics. I have seen a couple of papers on arXiv, this and this, strongly supporting my view. The authors are Markus Höpfer, Christian Fischer and Reinhard Alkofer. They work in the conformal window; this means that, for them, the lightest quarks are massless and chiral symmetry is exact. Indeed, in their study the quarks do not even acquire mass dynamically. But the question they answer is somewhat different: granted that the theory is infrared trivial (they do not state this explicitly, as this is not yet recognized, even if it is a “duck” indeed), how does the trivial infrared fixed point move as the number of quarks increases? The answer is in the following wonderful graph, with $N_f$ the number of quark flavours:

From this picture it is evident that there exists a critical number of quarks at which the theory becomes asymptotically safe and confining. So, quarks are critical to grant confinement, and Yang-Mills theory can happily be trivial. The authors took great care over all the involved approximations, as they solved the Dyson-Schwinger equations, which have always been their main tool, with a proper truncation. From the picture it is seen that if the number of flavours is below a threshold the theory is generally trivial, including the case of zero quarks. Otherwise, a non-trivial infrared fixed point is reached, granting confinement, and the gluon propagator is seen to move from a Yukawa form to a scaling form.
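The critical $N_f$ emerging from these Dyson-Schwinger studies can be contrasted with the textbook two-loop estimate, where an infrared (Banks-Zaks) fixed point of the SU(3) beta function first appears once the two-loop coefficient changes sign. This is only the standard perturbative picture, not the authors' computation:

```python
# Two-loop beta function for SU(3) with Nf massless flavours:
#   beta(g) = -(b0 g^3 + b1 g^5),
# with 16 pi^2 b0 = 11 - 2 Nf/3 and (16 pi^2)^2 b1 = 102 - 38 Nf/3.
# A perturbative infrared fixed point g*^2 = -b0/b1 exists when
# b0 > 0 and b1 < 0 (the Banks-Zaks window).
from fractions import Fraction

def b0(nf):  # coefficient times 16*pi^2
    return Fraction(11) - Fraction(2, 3) * nf

def b1(nf):  # coefficient times (16*pi^2)^2
    return Fraction(102) - Fraction(38, 3) * nf

# lowest Nf with an infrared fixed point at two loops
nf_lower = next(nf for nf in range(1, 17) if b0(nf) > 0 and b1(nf) < 0)
print(nf_lower)   # 9: the two-loop window opens at Nf = 9
# asymptotic freedom itself is lost at Nf = 33/2, the upper edge
```

The nonperturbative threshold need not coincide with this two-loop value, which is exactly why the DSE result above is informative.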

This result is really exciting and moves us a significant step forward toward the understanding of confinement. By my side, I am happy that another one of my ideas gets such a substantial confirmation.

Marco Frasca (2010). Mapping theorem and Green functions in Yang-Mills theory PoS FacesQCD:039,2010 arXiv: 1011.3643v3

Markus Hopfer, Christian S. Fischer, & Reinhard Alkofer (2014). Running coupling in the conformal window of large-Nf QCD arXiv arXiv: 1405.7031v1

Markus Hopfer, Christian S. Fischer, & Reinhard Alkofer (2014). Infrared behaviour of propagators and running coupling in the conformal window of QCD arXiv arXiv: 1405.7340v1

## f0(500) and f0(980) are not tetraquarks

27/06/2014

Last week I was in Giovinazzo, a really beautiful town near Bari in Italy, participating in the QCD@Work conference. This conference series is now at its 7th edition and, for me, it was my second attendance. The most striking news I heard was put forward on the first day and represents a remarkable result indeed. The talk was given by Maurizio Martinelli on behalf of the LHCb Collaboration. You can find the result on page 19 and in an arXiv paper. The question of the nature of the f0(500) has been a vexata quaestio since the first possible observation of this resonance. It entered the Particle Data Group catalog as f0(600) but was eliminated in the following years. Today its existence is no longer questioned and this particle is widely accepted. Its properties, such as the mass and the width, are also known with reasonable precision, starting from a fundamental work by Irinel Caprini, Gilberto Colangelo and Heinrich Leutwyler (see here). The longstanding question about this particle and its partner f0(980) was about their nature: it is generally difficult to fix the structure of a resonance in QCD, and there is no exception here.

The problem arose from famous papers by Jaffe in 1977 (this one and this one) that, using a quark-bag model, introduced a low-energy nonet of states each made of four quarks. These papers set the stage for what has been the current understanding of the f0(500) and f0(980) resonances. The nonet is completely filled by the QCD resonances below 1 GeV, and so it seems to fit the bill excellently.

Some have challenged this paradigm, claiming that the f0(500) could not be a tetraquark state (e.g. see here and here, but also papers by Wolfgang Ochs and Peter Minkowski disagree with the tetraquark model for these resonances). The answer came out straightforwardly from the LHCb Collaboration: both f0(500) and f0(980) are not tetraquarks, and the original view by Jaffe is no longer supported. Indeed, people who know the Nambu-Jona-Lasinio model should know quite well where the f0(500) (or $\sigma$) comes from, and I would also suggest that this model can accommodate higher states like the f0(980).

I should say that this is a further striking result coming from the LHCb Collaboration. Hopefully, it will give important hints toward a better understanding of low-energy QCD.

LHCb Collaboration (2014). Measurement of the resonant and CP components in $\overline{B}^0\rightarrow J/\psi\pi^+\pi^-$ decays arXiv: 1404.5673v2