Of course, Alcubierre’s solution is rather interesting from a physical point of view, as it belongs to a family of older solutions, like wormholes, time machines and the like, proposed by famous authors such as Kip Thorne, that arise when one imposes a metric and then checks the conditions for its existence. This check amounts to determining the required energy-momentum tensor which, unavoidably, turns out to be negative. These solutions therefore violate every energy condition of the Einstein equations, guaranteeing pathological behaviour. On the other hand, they appear the most palatable for science fiction about possible futures of space and time travel. In these times, when such technologies are widely employed by the film industry, firing the imagination of millions, we would hope that such futures were also possible.

It is interesting to note the procedure used to obtain these particular solutions. One engineers the metric at the desk and then substitutes it into the Einstein equations to check whether it is really a solution; in this way one fixes the energy requirements. Moving the other way around, it is instead difficult to come up out of the blue with a solution of the Einstein equations that displays such a particular behaviour. It is also possible that such solutions do not exist at all and always imply a violation of the energy conditions. Some theorems proved over the years seem to prohibit them (e.g. see here). Of course, I am convinced that the energy conditions must be respected if we want the physics that describes our universe. They cannot be evaded.

So, turning to the question in the title, could we conceive a warp drive solution of the Einstein equations without exotic matter? The answer can be yes, of course, provided we are able to recover the York time, or warp factor, in the same form Alcubierre obtained it with his pathological solution. At first, this seems mission impossible. But the space-time bubble we are considering is a very small perturbation, and perturbation theory can come to the rescue, particularly when this perturbation can be locally very strong. In 2005, I proposed such a solution (see here), together with a technique to solve the Einstein equations when the metric is strongly perturbed. My intent at that time was to give a proof of the BKL conjecture. A smart referee suggested that I give an example of an application of the method. The metric I obtained in this way, by perturbing a Schwarzschild metric, yields a solution with a York time (warp factor) identical to that of the Alcubierre metric. Of course, the energy conditions are respected, as I am directly solving the Einstein equations that enforce them.

The identity between the York times is obtained provided the form factor proposed by Alcubierre is taken to be 1, but this is just the simplest case. Here is an animation of my warp factor.

The bubble is seen moving, as expected, along the x direction.
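For readers who want to reproduce the qualitative behaviour, here is a small sketch of the standard Alcubierre form factor and the York time it generates along the axis of motion. The parameter values (bubble speed, radius, wall sharpness) are illustrative choices for the sketch, not values taken from any paper.

```python
# Sketch of the Alcubierre form factor and the York time along the bubble axis.
# The parameter values (v_s, R, sigma) are illustrative, not taken from a paper.
import numpy as np

v_s, R, sigma = 1.0, 1.0, 8.0  # bubble speed, radius, wall sharpness

def f(r):
    """Alcubierre's top-hat form factor: ~1 inside the bubble, ~0 outside."""
    return (np.tanh(sigma * (r + R)) - np.tanh(sigma * (r - R))) / (2.0 * np.tanh(sigma * R))

def df_dr(r):
    """Derivative of the form factor with respect to r."""
    return sigma * (np.cosh(sigma * (r + R))**-2 - np.cosh(sigma * (r - R))**-2) / (2.0 * np.tanh(sigma * R))

def york_time(x):
    """York time theta = v_s * (x_s / r_s) * df/dr_s on the axis (y = z = 0),
    with x_s the displacement from the bubble centre."""
    r = np.abs(x)
    return v_s * np.where(r > 0, x / np.maximum(r, 1e-12), 0.0) * df_dr(r)

x = np.linspace(-4, 4, 401)
theta = york_time(x)
# Space contracts in front of the bubble (theta < 0) and expands behind (theta > 0):
print(theta[x > 0].min() < 0, theta[x < 0].max() > 0)
```

Plotting `theta` over a grid in the plane reproduces the familiar two-lobed warp-factor picture that the animation shows travelling along x.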

My personal hope is that this will go beyond a mere mathematical curiosity. On the other hand, it remains to be understood how to produce this kind of perturbation of a given metric. I can think of the Einstein-Maxwell equations solved using perturbation theory. There is a lot of literature, and a lot of great contributions, on this subject.

Finally, this could give a meaning to the following video by NASA.

About the Higgs particle, after the important announcement about the existence of the ttH process, both ATLAS and CMS are pushing further to improve their precision. For the signal strength they give the following results. For ATLAS (see here)

and CMS (see here)

The news is that the errors have shrunk and the two experiments agree. They show a small tension, 13% and 17% respectively, but the overall result is consistent with the Standard Model.
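To see why a 13-17% excess can still be called consistent with the Standard Model, one can express the excess as a pull in units of the measurement error. The uncertainties below are hypothetical placeholders (the actual error bars are in the linked figures); only the central values come from the text.

```python
# Pull of a measured Higgs signal strength mu = sigma_obs / sigma_SM against
# the Standard Model expectation mu = 1. The ~10% errors are hypothetical
# placeholders; only the 13% and 17% central excesses are quoted in the post.

def pull(mu, err):
    """Number of standard deviations between the measured mu and the SM value 1."""
    return (mu - 1.0) / err

for name, mu, err in [("ATLAS", 1.13, 0.10), ("CMS", 1.17, 0.10)]:
    print(f"{name}: mu = {mu:.2f} -> {pull(mu, err):.1f} sigma")
```

With errors of that order, both excesses sit below the conventional 2-sigma level, which is why "a small tension but consistent" is the right reading.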

When the results are unpacked into the contributions from the different processes, CMS claims some tension in the WW decay that should be kept under scrutiny in the future (see here). They presented the results from data and so, for the moment, there is no significant improvement with respect to the Moriond conference this year. The situation is rather better for the ZZ decay, where no tension appears and the agreement with the Standard Model is there in all its glory (see here). Things are quite different, but not too much, for ATLAS: in this case they observe some tensions, but these are all below (see here). For the WW decay, ATLAS does not see anything above (see here).

So, although there is something to keep an eye on as the data increase, which will reach this year, the Standard Model is in good health with respect to the Higgs sector, even if a lot remains to be answered and precision measurements are the main tool. The correlation in the tt pair is absolutely promising, and we should hope this will be confirmed as a discovery.


This paper contains the exact beta function of a Yang-Mills theory. It confirms that confinement arises from the combination of the running coupling and the propagator. This idea has been around in several papers in recent years. It emerged as soon as people realized, after extensive lattice studies, that the propagator by itself was not enough to grant confinement.

It is interesting to point out that confinement is rooted in BRST invariance and asymptotic freedom. The Kugo-Ojima confinement criterion makes it possible to close the argument in a rigorous way, yielding the exact beta function of the theory.

In a public statement (see here), CERN finally acknowledged, for the first time, a significant discrepancy between CMS data and the Standard Model for the signal strengths in the Higgs decay channels. They claim a 17% difference. This is what I have advocated for some years and have published in reputable journals. I will discuss this below. For now, I would only like to show you the CMS results in the figure below.

ATLAS, for its part, sees a significant discrepancy in the ZZ channel () and compatibility for the WW channel. Here are their results.

On the left the WW channel is shown, and on the right the combined and ZZ channels.

The discrepancy is due, as I have shown in some papers (see here, here and here), to the improper use of perturbation theory to evaluate the Higgs sector. The true propagator of the theory is a sum of Yukawa-like propagators with a harmonic-oscillator spectrum. I solved this sector of the Standard Model exactly. So, when the full propagator is taken into account, the discrepancy moves toward an increase of the signal strength. Is it worth a try?
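The shape of such a propagator can be sketched as a tower of Yukawa-like terms with masses proportional to the odd integers (a harmonic-oscillator-like spectrum) and rapidly decreasing weights. The mass scale and the exponential suppression of the weights below are illustrative assumptions for the sketch, not the values derived in the papers.

```python
# Illustrative sum of Yukawa-like propagators (Euclidean momentum squared p2)
# with an oscillator-like mass spectrum m_n = (2n+1) m0 and exponentially
# suppressed weights. Masses and weights are assumptions, not published values.
import math

m0 = 1.0  # illustrative mass scale

def propagator(p2, n_terms=10):
    """G(p^2) = sum_n B_n / (p^2 + m_n^2) with m_n = (2n+1) m0."""
    total = 0.0
    for n in range(n_terms):
        m_n = (2 * n + 1) * m0
        B_n = math.exp(-(2 * n + 1) * math.pi / 2)  # illustrative suppression
        total += B_n / (p2 + m_n**2)
    return total

# The n = 0 pole dominates: higher excitations are strongly depressed.
B0, B1 = math.exp(-math.pi / 2), math.exp(-3 * math.pi / 2)
print(f"B1/B0 = {B1 / B0:.3f}")
```

With weights falling off this fast, only the lowest pole is visible in practice, which is why the higher excitations would be so hard to dig out of the data.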

This means that this is not physics beyond the Standard Model but, rather, the Standard Model in its full glory that is teaching something new to us about quantum field theory. Now, we are eager to see the improvements in the data to come with the new run of LHC starting now. In the summer conferences we will have reasons to be excited.

On Friday, the last day of the conference, I posted the following tweet after attending the talk given by Shunsuke Honda on behalf of ATLAS at QCD 17:

and the reason was this slide

The title of the talk was “Cross sections and couplings of the Higgs Boson from ATLAS”. As you can read from it, there is a deviation of about 2 sigmas from the Standard Model for the Higgs decaying to ZZ(4l) via VBF. Indeed, they can still claim agreement, but it is interesting anyway (maybe we are missing something?). The previous day, at EPSHEP 2017, Ruchi Gupta, on behalf of ATLAS, presented an identical talk with the title “Measurement of the Higgs boson couplings and properties in the diphoton, ZZ and WW decay channels using the ATLAS detector”, and the slide was the following:

The result is still there, but with a somewhat more sober presentation. What does this mean? Presently, it amounts to very little. We are still within the Standard Model, even if something seems to be peeping out. In order to claim a discovery, this effect should be seen with a lower error, and at CMS too. The implication would be that there could be a more complex spectrum of the Higgs sector, with a possible new understanding of naturalness if such a spectrum had no formal upper bound. People at CERN promised more data in the coming weeks. Let us see what happens to this small effect.

When a theory is too hard to solve, people try to consider lower dimensional cases. This also happened for Yang-Mills theory. The four dimensional case is notoriously difficult to manage due to the large coupling, while the three dimensional case has been treated both theoretically and by lattice computations. In the latter case, the ground state energy of the theory is known very precisely (see here). So, a sound theoretical approach from first principles should be able to reproduce that number at the same level of precision. We know this is the situation for the Standard Model with respect to some experimental results, but a pure Yang-Mills theory has not been observed in nature and we have to content ourselves with computer data. The reason is that a Yang-Mills theory is realized in nature only in interaction with other kinds of fields, be they scalars, fermions or vectors.

In these days, I received the news that my paper on three dimensional Yang-Mills theory has been accepted for publication in the European Physical Journal C. Here is the table of the ground state energy for SU(N) at different values of N, compared to lattice data:

| **N** | **Lattice** | **Theoretical** | **Error** |
|-------|-------------|-----------------|-----------|
| 2     | 4.7367(55)  | 4.744262871     | 0.16%     |
| 3     | 4.3683(73)  | 4.357883714     | 0.2%      |
| 4     | 4.242(9)    | 4.243397712     | 0.03%     |
| ∞     | 4.116(6)    | 4.108652166     | 0.18%     |
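The Error column can be checked directly: it is simply the relative deviation of the theoretical value from the lattice central value, as this short check shows.

```python
# Relative error between the theoretical ground-state values and the lattice
# central values from the table above.
data = {
    "2":   (4.7367, 4.744262871),
    "3":   (4.3683, 4.357883714),
    "4":   (4.242,  4.243397712),
    "inf": (4.116,  4.108652166),
}

for N, (lattice, theory) in data.items():
    err = abs(theory - lattice) / lattice * 100.0
    print(f"N = {N}: {err:.2f}%")
```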

These results are strikingly good: the agreement is well below 1%. This, in turn, implies that the underlying theoretical derivation is sound. Besides, the approach proves successful in four dimensions as well (see here). My hope is that this marks the beginning of an era of high precision theoretical computations in strong interactions.

Andreas Athenodorou & Michael Teper (2017). SU(N) gauge theories in 2+1 dimensions: glueball spectra and k-string tensions. J. High Energ. Phys. 2017: 15. arXiv: 1609.03873v1

Marco Frasca (2016). Confinement in a three-dimensional Yang-Mills theory. arXiv: 1611.08182v2

Marco Frasca (2015). Quantum Yang-Mills field theory. Eur. Phys. J. Plus (2017) 132: 38. arXiv: 1509.05292v2


Exact solutions of quantum field theories are very rare and normally refer to toy models and pathological cases. Quite recently, I put on arXiv a pair of papers presenting exact solutions both of the Higgs sector of the Standard Model and of Yang-Mills theory made just of gluons. The former appeared a few months ago (see here), while the latter was accepted for publication a few days ago (see here). I updated the latter just today, and the accepted version will appear on arXiv on 2 January next year.

What does it mean to solve a quantum field theory exactly? A quantum field theory is exactly solved when we know all its correlation functions. From them, thanks to the LSZ reduction formula, we are in principle able to compute any observable, be it a cross section or a decay rate. The shortest way to the correlation functions is through the Dyson-Schwinger equations. These equations form a hierarchy, with each equation depending on higher order correlators, and so they are generally very difficult to solve. They have been widely used in studies of Yang-Mills theory, either with some truncation scheme or in numerical studies. Their exact solutions are generally not known and are expected to be too difficult to find.

The problem can be faced when some solutions to the classical equations of motion of a theory are known. In this way there is a possibility to treat the Dyson-Schwinger set. Anyhow, before entering into their treatment, it should be emphasized that in the literature the Dyson-Schwinger equations were managed in just one way: using their integral form and expressing all the correlation functions in momentum space. It was an original view by Carl Bender that opened up the way (see here). The idea is to write the Dyson-Schwinger equations in their differential form in coordinate space. So, when you have exact solutions of the classical theory, a possibility opens up to treat the quantum case as well!
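The classical solutions in question are, for a massless quartic scalar field, exact oscillatory (elliptic-function) solutions. As a minimal sketch of this starting point (the scalar equation and parameters below are an illustration, not the full Yang-Mills case), one can integrate the equation φ'' + λφ³ = 0 numerically and verify that the motion is strictly periodic with the period predicted by the elliptic sn solution, the feature that ultimately produces an oscillator-like spectrum.

```python
# Numerical check that the classical massless quartic oscillator
# phi'' + lam * phi^3 = 0 is exactly periodic (an elliptic-function solution).
# With lam = 2 and phi(0) = 0, phi'(0) = 1, the solution is phi(t) = sn(t, i),
# whose period is T = 4 K(-1), K being the complete elliptic integral of the
# first kind (computed below by Simpson quadrature). Parameters are illustrative.
import math

lam = 2.0

# K(-1) = integral_0^{pi/2} dtheta / sqrt(1 + sin^2(theta)), by Simpson's rule.
N = 2000
h = (math.pi / 2) / N
K = 0.0
for i in range(N + 1):
    w = 1 if i in (0, N) else (4 if i % 2 == 1 else 2)
    K += w / math.sqrt(1.0 + math.sin(i * h)**2)
K *= h / 3.0
T = 4.0 * K  # one full period of the oscillation

def deriv(y):
    phi, dphi = y
    return (dphi, -lam * phi**3)

def rk4_step(y, dt):
    """One classical Runge-Kutta 4 step for the pair (phi, phi')."""
    k1 = deriv(y)
    k2 = deriv((y[0] + dt/2*k1[0], y[1] + dt/2*k1[1]))
    k3 = deriv((y[0] + dt/2*k2[0], y[1] + dt/2*k2[1]))
    k4 = deriv((y[0] + dt*k3[0], y[1] + dt*k3[1]))
    return (y[0] + dt/6*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0]),
            y[1] + dt/6*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1]))

steps = 20000
y = (0.0, 1.0)
for _ in range(steps):
    y = rk4_step(y, T / steps)

# After one period the orbit closes: phi returns to 0 and phi' to 1.
print(abs(y[0]), abs(y[1] - 1.0))
```

The strictly periodic, anharmonic oscillation is what gets promoted, at the quantum level, to a discrete tower of excitations.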

This shows unequivocally that a Yang-Mills theory can display a mass gap and an infinite spectrum of excitations. Of course, if nature has chosen the particular ground state depicted by such classical solutions, we will have hit the mark. This is a possibility, but the proof is strongly tied to what is going on in the Higgs sector of the Standard Model, which I solved exactly, though without other interacting matter. If the decay rates of the Higgs particle agree with our computations, we will be on the right track for Yang-Mills theory too. Nature tends to repeat working mechanisms.

Marco Frasca (2015). A theorem on the Higgs sector of the Standard Model. Eur. Phys. J. Plus (2016) 131: 199. arXiv: 1504.02299v3

Marco Frasca (2015). Quantum Yang-Mills field theory. arXiv: 1509.05292v1

Carl M. Bender, Kimball A. Milton & Van M. Savage (1999). Solution of Schwinger-Dyson Equations for ${\cal PT}$-Symmetric Quantum Field Theory. Phys. Rev. D 62: 085001 (2000). arXiv: hep-th/9907045v1

I hope this will be properly evaluated by the scientific community, moving toward a more serious treatment of this effect.

**Update**: The links were removed by the subreddit’s moderator. I have copies of these files, but I do not intend to publish them in any form.

**Update**: Here is the published paper.

ATLAS and CMS nuked our illusions about that bump. More than 500 papers were written on it, and some of them went through Physical Review Letters. Now we are contemplating the ruins of that house of cards. This says a lot about the situation in hep these days. It should be emphasized that people at CERN warned that those data were not enough to draw a conclusion, and if they fix the threshold at , a reason must exist. But careless acts are common today if you are a theorist and no input from experiment has been coming for a long time.

It should be said that the LHC confirming the Standard Model and nothing else is one of the possibilities. We should hope that a larger accelerator can be built after the LHC’s decommissioning, as there is a long way to go to the Planck energy, which we do not yet know how to probe.

What remains? I think there is a lot. My analysis of the Higgs sector is still there to be checked, as I will explain in a moment, but this is just another way to treat the equations of the Standard Model, not to go beyond it. Besides, by the end of the year they will reach , almost tripling the current integrated luminosity, and something interesting could still pop out. There are many years of results ahead, and there is no need to despair. Just wait. This is one of the most important activities of a theorist. Impatience does not work in physics, and most of all in hep.

About the signal strength, things still seem too far from being settled. I hope to see better figures by the end of the year. ATLAS is off the mark, going well beyond unity for WW, as happened before. CMS claimed for the WW decay, worsening the excellent measurement of reached in Run I. CMS agrees fairly well with my computations, but I should warn that the error bar is still too large, and is now even worse. I recall that the signal strength is obtained as the ratio of the measured cross section to the one obtained from the Standard Model. The fact that it is smaller does not necessarily mean that we are beyond the Standard Model, but rather that we are solving the Higgs sector in a different way than standard perturbation theory. This solution entails higher excitations of the Higgs field, but they are strongly depressed and very difficult to observe at present. The only mark could be the signal strength of the observed Higgs particle. Finally, the ZZ channel is significantly less sensitive, and the error bars are so large that one can still accommodate whatever one likes. The overproduction seen by ATLAS is just a fluctuation that will go away in the future.

The final sentence to this post is what we have largely heard in these days: Standard Model rules.
