Wednesday 24 October 2012

Higgs: New Deal

The new round of Higgs data will be presented on the 15th of November at a conference in Kyoto, and on blogs a few days earlier. The amount of data will increase by about 2/3 compared to what was available last summer. This means the statistical errors should naively drop by a bit over 20%, or more in the likely case of some improvements in the analyses. Here's a short guide to the hottest Higgs questions that may be answered.
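The naive scaling follows from counting statistics: uncertainties shrink like one over the square root of the sample size. A two-line sanity check (statistics only; analysis improvements come on top of this):

```python
from math import sqrt

def naive_error_ratio(extra_data_fraction):
    """Statistical errors scale like 1/sqrt(N); growing the dataset
    by a fraction f shrinks them by a factor 1/sqrt(1 + f)."""
    return 1.0 / sqrt(1.0 + extra_data_fraction)

# About 2/3 more data than last summer (statistics only; analysis
# improvements would reduce the errors further):
print(f"errors shrink to {naive_error_ratio(2.0 / 3.0):.0%} of the summer value")
```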

  • Will the γγ rate remain high? 
    Last summer the Higgs boson showed up much as predicted by the standard model. The most intriguing discrepancy was that both ATLAS and CMS saw too many Higgs decays to photon pairs, exceeding the standard model expectation by 80% and 60% respectively. Statistically speaking, the excess in each experiment is below 2 sigma, so at this point all the observed rates are in decent agreement with the standard model. But that doesn't stop us from dreaming and crossing our fingers. If the excess is a statistical fluke, we would expect the central value of the measured H→γγ rate to decrease, and the significance of the excess to remain moderate. But if, purely hypothetically, the central value remains high and the significance of the excess grows, then... well, then it's gonna get hot.
  • Will the ττ rate remain low?
    Another puzzling piece of Higgs data from last summer was that CMS failed to see any excess in the H→τ+τ- channel, despite their expected sensitivity being close to the predicted standard model rate. In fact, they came close to excluding the 125 GeV standard model Higgs in that channel! This discrepancy carries less weight than the diphoton excess, both because it is reported by only one experiment (ATLAS did not update the ττ channel with 8 TeV data last summer) and because the strong limit seems to be driven by a large negative background fluctuation in one of the search categories. Nevertheless, it is conceivable that something interesting is cooking here. In 3 weeks both experiments should speak up with a clearer voice, and the statistics should be high enough to get a feeling for what's going on.
  • Is the Vh → bb rate enhanced? 
    The LHC has proven that the Higgs couples to bosons: gluons, photons, W and Z. However, it has not yet pinpointed the couplings to fermions (except indirectly, since the effective coupling to gluons is likely mediated by virtual top quarks). As mentioned above, no sign of Higgs decays to tau lepton pairs has been detected so far. Nor has the LHC seen any clear sign of Higgs decays to b-quarks, even though that is probably the most frequent decay mode. On the other hand, the Tevatron experiments, with their dying breath, have reported a 3 sigma evidence for the h → bb decays, with the Higgs produced in association with a W or Z boson. The intriguing (or maybe suspicious) aspect of the Tevatron result was that the observed rate was twice that predicted by the standard model. In 3 weeks the sensitivity of the LHC in the b-bbar channel should exceed that of the Tevatron. It is unlikely that we'll get clear evidence for h→bb decays then, but at least we should learn whether the Tevatron hint of an enhanced Vh → bb rate can be true.
  • Will they see h→Zγ?
    Another possible channel to observe the Higgs boson is via its decay to one photon and one Z boson, with the Z subsequently decaying to a pair of charged leptons. Much like in the well-studied h→ZZ→4l and h→γγ channels, the kinematics of the h→Zγ→γ2l decay can be cleanly reconstructed and offers a good Higgs mass resolution. The problem is the low rate: the Higgs decay to Zγ is even rarer than that to γγ, and on top of that one pays the penalty of the low branching fraction for the Z→l+l- decay. According to the estimates I'm aware of, the LHC is not yet sensitive to h→Zγ produced at the standard model rate. However, if we assume it's new physics that's boosting the h→γγ rate, it is very likely that the h→Zγ rate is also boosted, by a similar or larger factor. Thus, it is interesting to see what limits the LHC can deliver in the h→Zγ channel, as they may provide non-trivial constraints on new physics.
  • Does Higgs have spin zero?
    Obviously, this question carries a similar potential for surprise as a football game between Brazil and Tonga. Indeed, spin-1 is disfavored on theoretical grounds (an on-shell spin-1 particle cannot decay to two photons), while a spin-2 particle cannot by itself ensure the consistency of electroweak symmetry breaking the way the Higgs boson does. Besides, we already know the 125 GeV particle couples to the W and Z bosons, gluons, and photons with roughly the strength of the standard model Higgs boson. It would be an incredible coincidence if a particle with a spin or parity different from the Higgs reproduced the event rates observed at the LHC, given that the tensor structures of the couplings are completely different for other spins. Nevertheless, a clear experimental preference for spin-0 would be useful to satisfy some pedantic minds or some Nobel committees. In particular, one needs to demonstrate that the Higgs boson is produced isotropically (without a preferred direction) in the center-of-mass frame of the collision. With the present statistics it should already be possible to discriminate between spin-0 and the alternative hypotheses.

So, keep your ear to the ground: the data are being unblinded as we speak, and the first numbers are already being bandied about in cafeterias and on Facebook. Intriguingly, this blog post clearly hints there is a lot to rumor about in the new data ;-) Is it the high γγ rate? The low ττ rate? Something else? Well, there are still 3 weeks left and the numbers may shift a bit, so let's not spoil the fun just yet... In any case, if you have an experimentalist friend, now is the best time to invite her for a drink or a dance ;-)

Wednesday 10 October 2012

A problem with bees

This one is not about colony collapse disorder but about particle bees, also known as b-quarks. Older readers who still remember the LEP collider may also remember a long-standing anomaly in one of the LEP precision measurements. The observable in question is the forward-backward asymmetry of b-quark production in electron-positron collisions. In events with a pair of b-jets in the final state, one counts the number of b-quarks (as opposed to b-anti-quarks) going in the forward and backward directions (defined by the electron and positron beam directions), and then defines the asymmetry as:
AFB = (NF - NB)/(NF + NB),

where NF (NB) is the number of b-quarks going in the forward (backward) direction.

The observable is analogous to the top forward-backward asymmetry, widely discussed in the context of the anomalous Tevatron measurements, although the origins of the two anomalies are unlikely to be directly related. At LEP, b-quark pair production is mediated mostly by a photon or a Z boson in the s-channel. The latter has chiral couplings to matter, that is to say, it couples differently to left- and right-handed particles. Thanks to that, a significant b-quark asymmetry, of order 10%, is predicted in the standard model. However, the asymmetry observed at LEP was slightly smaller than predicted. The anomaly, sitting in the 3 sigma ballpark, has attracted some attention but has never been viewed as a smoking gun of new physics. Indeed, it was just one anomaly in a sea of LEP observables that perfectly matched the standard model predictions. In particular, another b-quark precision observable measured at LEP - the production rate of b-quark pairs, the so-called Rb - seemed to be in perfect agreement with the standard model. New physics models explaining the data involved a certain level of conspiracy: one had to arrange things such that the asymmetry, but not the overall rate, was affected.

Fast forward to the year 2012. The Gfitter group posted an update of the standard model fits to the electroweak precision observables. One good reason to look at the update is that, as of this year, the standard model no longer has any free parameters that haven't been directly measured: the Higgs mass, on which several precision observables depend via loop effects, has been pinpointed by ATLAS and CMS to better than 1%. But there's more to it than that. One notices that, although most precision observables fit the standard model perfectly, two measurements stand out above 2 sigma. Wait, two measurements? Right: according to the latest fits, not only the b-quark asymmetry but also the b-quark production rate at LEP deviates from the standard model prediction, at the level of 2.5 sigma.

The data haven't changed, of course. Nor is the new discrepancy due to including the Higgs mass measurement, as that lies very close to the previous indirect determinations from electroweak fits. What happened is that the theory prediction has migrated. More precisely, 2-loop electroweak corrections to Rb computed recently turned out to be significant and moved the theoretical prediction down. Thus, the value of Rb measured at LEP is, according to the current interpretation, larger than predicted by the standard model. The overall goodness of the standard model fit has decreased, with the current p-value around 7%.

Can this be a hint of new physics? Actually, it's trivial to explain the anomalies in a model-independent way. It is enough to assume that the coupling of the Z boson to b-quarks deviates from the standard model value:
L ⊃ (g/cosθW) Zμ [ (gLb + δgLb) b̄L γμ bL + (gRb + δgRb) b̄R γμ bR ]

In the standard model gLb ≈ -0.4 and gRb ≈ 0.08, while δgLb = δgRb = 0. Given the two additional parameters δgLb and δgRb, we have enough freedom to account for both b-quark anomalies. The fit from this paper shows that one needs an upward shift of the right-handed coupling by 10-30%, possibly but not necessarily accompanied by a tiny (less than 1%) shift of the left-handed coupling. This sort of modification is easy to get in some concrete scenarios beyond the standard model, for example in Randall-Sundrum-type models with the right-handed b-quark localized near the IR brane.
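As a sanity check, the tree-level Z-pole relation AFB = (3/4)·Ae·Ab, with Af = (gL² - gR²)/(gL² + gR²), reproduces the "order 10%" asymmetry from these couplings. This is only a sketch: QED/QCD corrections and the photon-exchange contribution are ignored.

```python
def asym(gl, gr):
    """Polarization asymmetry A_f = (gL^2 - gR^2)/(gL^2 + gR^2)
    for a fermion with left/right Z couplings gL, gR."""
    return (gl ** 2 - gr ** 2) / (gl ** 2 + gr ** 2)

# Tree-level Z couplings g = T3 - Q*sin^2(theta_W), with sin^2(theta_W) ~ 0.231.
sw2 = 0.231
A_e = asym(-0.5 + sw2, sw2)   # electron: gL ~ -0.27, gR ~ 0.23
A_b = asym(-0.4, 0.08)        # b-quark couplings quoted in the text

# At the Z pole, the forward-backward asymmetry is A_FB^b = (3/4) A_e A_b.
afb = 0.75 * A_e * A_b
print(f"A_b ~ {A_b:.2f}, A_FB^b ~ {afb:.2f}")  # roughly 10%
```

Note that Ab is close to 1 because gLb² dominates over gRb²; that is why a modest upward shift of the small right-handed coupling can move the asymmetry without visibly changing the total rate.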

So, maybe, LEP has seen a hint of compositeness of right-handed b-quarks? Well, one more 2.5 sigma anomaly does not make a summer; overall the standard model is still in good shape. However, it's intriguing that both b-quark-related LEP precision observables do not quite agree with the standard model. Technically, modifying both AFB and Rb is much more natural from the point of view of new physics interpretations. So I guess it may be worth following, without too much excitement but with some polite interest, the news on B' searches at the LHC.

Important update: Unfortunately, the calculation of Rb referred to in this post later turned out to be erroneous. After correcting the bug, Rb is less than 1 sigma away from the standard model prediction.

Tuesday 9 October 2012

Swedes playing Russian Roulette

No particle physicist received a phone call from Stockholm today. There had been some expectations of an award honoring the Higgs discovery. Well, it was maybe naive, but not completely unrealistic, to think that the Nobel committee might want to reestablish some connection with the original Nobel's will (which, anecdotally, awarded prizes for discoveries made during the preceding year). To ease my disappointment, let me write about a purely probabilistic but potentially gruesome aspect of today's decision. Warning: the discussion below is in really bad taste; don't even start reading unless Borat is among your favorite movies!

Peter Higgs is 83, and François Englert is almost 80. Taking the US data on life expectancy as a reference, they have respectively a 9% and a 6% probability of passing away within a year from now. Thus, the probability of at least one of them being gone by the time of the next announcement is approximately 14%! To give an everyday analogy, that's only a tad safer than playing Russian roulette with 1 bullet in a six-shot Colt revolver. The probability grows to a stunning 27% if one includes Philip Anderson among the potential recipients (nearly 89, 15%). Obviously, the probability curve rises steeply as a function of time, and approaches 100% for the typical Nobel recognition time lag.
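The arithmetic behind these percentages is just the complement of joint survival, treating the individual risks as independent:

```python
def p_at_least_one(probs):
    """Chance that at least one of several independent events happens:
    one minus the probability that none of them does."""
    p_none = 1.0
    for p in probs:
        p_none *= 1.0 - p
    return 1.0 - p_none

print(f"Higgs + Englert:            {p_at_least_one([0.09, 0.06]):.0%}")
print(f"Higgs + Englert + Anderson: {p_at_least_one([0.09, 0.06, 0.15]):.0%}")
```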

Well, the Nobel for the Higgs discovery will be awarded sooner or later. Even if one of the crucial actors does not make it, the prestige of the physics Nobel prize won't be hurt too much (it has survived far more serious embarrassments). But it would be just sad and unjust, even more so than the Cabibbo story. So why not make it rather sooner than later?

Here's more on the dangers of playing Russian roulette:

Tuesday 2 October 2012

What's the deal with vacuum stability?

This year we learned that the Higgs mass is 125.5 GeV, give or take 1 GeV. As a consequence, we learned that God plays not only dice but also Russian roulette. In other words, that life is futile, because everything we cherish and hold dear will decay. In other words, that the vacuum of the standard model is not stable.

Before we continue, keep in mind the important disclaimer:
All this discussion is valid assuming the standard model is the correct theory all the way up to the Planck scale, which is unlikely. 
Indeed, while it's very likely that the standard model is an adequate description of physics at the energies probed by the LHC, we have no compelling reason to assume it works at, say, 100 TeV. On the contrary, we know there should be some new particles somewhere, at least to account for dark matter and the baryon asymmetry of the universe, and those degrees of freedom may well affect the discussion of vacuum stability. But for the time being, let's assume there are no new particles beyond the standard model with a significant coupling to the Higgs field.

The stability of our vacuum depends on the sign of the quartic coupling λ in the λ|H|^4 term of the Higgs potential: for negative λ the potential is unbounded from below and therefore unstable. We know the value of λ at the weak scale exactly: from the Higgs mass of 125 GeV and the expectation value of 246 GeV it follows that λ = 0.13, positive of course. But panta rhei, and λ is no exception. At large values of |H|, the Higgs potential in the standard model is, to a good approximation, given by λ(|H|) |H|^4, where λ(|H|) is the running coupling evaluated at the scale |H|. If the Higgs were decoupled from the rest of matter, λ would grow with the energy scale and would eventually explode into a Landau pole. However, the Yukawa couplings of the Higgs boson to fermions provide another contribution to the evolution equations, one that works toward decreasing λ at large energies. In the standard model the top Yukawa coupling is large, of order 1, while the Higgs self-coupling is moderate, so Yukawa wins.
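This competition can be sketched with a toy one-loop integration of the standard model renormalization group equations. The input values at the top mass are rough ballpark numbers, and the state-of-the-art analyses are two-loop with careful threshold matching, so this crude version finds the zero crossing a few orders of magnitude below the scale quoted in the papers; the qualitative behavior, the top Yukawa dragging λ negative, is the same.

```python
import math

def run_sm_couplings(mu0=173.0, mu_max=1.2e19, n_steps=20000):
    """Crude Euler integration of the one-loop standard model RGEs for
    (g1, g2, g3, yt, lam), with t = ln(mu/GeV). Inputs at the top mass
    are rough ballpark values, not a precision extraction."""
    g1, g2, g3 = 0.36, 0.65, 1.16    # hypercharge, weak, strong couplings
    yt, lam = 0.94, 0.13             # top Yukawa, Higgs quartic
    t = math.log(mu0)
    dt = (math.log(mu_max) - t) / n_steps
    k = 1.0 / (16.0 * math.pi ** 2)
    history = []
    for _ in range(n_steps):
        b_g1 = k * (41.0 / 6.0) * g1 ** 3
        b_g2 = -k * (19.0 / 6.0) * g2 ** 3
        b_g3 = -k * 7.0 * g3 ** 3
        b_yt = k * yt * (4.5 * yt ** 2 - 8.0 * g3 ** 2
                         - 2.25 * g2 ** 2 - (17.0 / 12.0) * g1 ** 2)
        # The -6 yt^4 term is what eventually drives lam negative.
        b_lam = k * (24.0 * lam ** 2 + 12.0 * lam * yt ** 2 - 6.0 * yt ** 4
                     - lam * (9.0 * g2 ** 2 + 3.0 * g1 ** 2)
                     + 0.375 * (2.0 * g2 ** 4 + (g2 ** 2 + g1 ** 2) ** 2))
        g1, g2, g3 = g1 + dt * b_g1, g2 + dt * b_g2, g3 + dt * b_g3
        yt, lam = yt + dt * b_yt, lam + dt * b_lam
        t += dt
        history.append((math.exp(t), lam))
    return history

history = run_sm_couplings()
crossing = next(mu for mu, lam in history if lam < 0)
print(f"lambda turns negative near mu ~ {crossing:.0e} GeV")
```

At this level of approximation λ stays only slightly negative all the way to the Planck scale, which is what makes the vacuum long-lived rather than promptly unstable.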

In the plot showing the evolution of λ in the standard model (borrowed from the latest state-of-the-art paper) one can see that at a scale of about 10 million TeV the Higgs self-coupling becomes negative. That sounds like a catastrophe, as it naively means that the Higgs potential is unbounded from below. However, we can reliably use quantum field theory only up to the Planck scale, and one can assume that some unspecified physics near the Planck scale (for example, |H|^6 and higher terms in the potential) restores the boundedness of the Higgs potential. Still, between 10^10 and 10^19 GeV the potential is negative, and therefore it has a global minimum at large |H| that is much deeper than the vacuum we live in. As a consequence, the path integral receives contributions from field configurations interpolating between the two vacua, leading to a non-zero probability of tunneling into the other vacuum.

Fortunately for us, the tunneling probability is proportional to Exp[-1/λ], and λ gets only slightly negative in the standard model. Thus, no reason to panic: our vacuum is meta-stable, meaning its average lifetime extends well beyond December 2012. Nevertheless, there is something intriguing here. We happen to occupy a very special patch of the standard model parameter space. First of all, there's the good old hierarchy problem: the mass term of the Higgs field takes a very special (fine-tuned?) value such that we live extremely close to the boundary between the broken (v > 0) and the unbroken (v = 0) phases. Now we realize the potential is even more special: the quartic coupling is such that two vacua coexist, one at low |H| of order TeV and the other at large |H| of order the Planck scale. Moreover, not only λ but also its beta function is nearly zero near the Planck scale, meaning that λ evolves very slowly at high scales. Who sets these boundary conditions? Is that yet another remarkable coincidence, or is there a physical reason? Something to do with quantum gravity? Something to do with inflation? I think it's fair to say that so far nobody has presented a compelling proposal explaining these boundary conditions satisfied by λ.
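To see why a slightly negative λ is harmless, recall the semiclassical estimate behind the Exp[-1/λ] above: the decay rate is suppressed by the action of the tunneling bounce, which for a pure negative-quartic potential in the normalization V = (λ/4) h^4 is S = 8π²/(3|λ|). The value |λ| = 0.01 below is purely illustrative, just to show the scale of the suppression:

```python
import math

def bounce_action(lam):
    """Euclidean action of the scale-invariant bounce for a potential
    V = (lam/4) h^4 with lam < 0: S = 8 pi^2 / (3 |lam|).
    (A different normalization of the quartic rescales this number.)"""
    return 8.0 * math.pi ** 2 / (3.0 * abs(lam))

# Illustrative value only: lam ~ -0.01 at high scales.
S = bounce_action(-0.01)
print(f"bounce action S ~ {S:.0f}; rate suppressed by exp(-{S:.0f})")
```

An exponential suppression of order exp(-2000) swamps any power-law prefactor, so the predicted lifetime dwarfs the age of the universe.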

Ah, and don't forget the disclaimer:
All this discussion is valid assuming the standard model is the correct theory all the way up to the Planck scale, which is unlikely.