Friday, 23 November 2012

BS and SUSY

  The decay of a neutral Bs meson into a muon pair is a very rare process whose rate could in principle be severely affected by new physics beyond the standard model. We now know it is not: given the rate measured by the LHCb experiment, any new contribution to the decay amplitude has to be smaller than the standard model one. There's a heated discussion going on and on about the interpretation of this result in the context of supersymmetry. Indeed, the statements describing the LHCb result as "a blow to supersymmetry" or "putting SUSY into hospital" are silly (if you think that's the most spectacular change of loyalties since Terminator 2, read on till the end ;-) But what is the true meaning of this result?

To answer this question at a quantitative level it pays to start with a model-independent approach (and a technical one too, to filter the audience ;-)  B-meson decays are low-energy processes properly described within a low-energy effective theory where the heavy particles, like the W/Z bosons or any new physics, are integrated out. That is to say, one can think of the Bs→μμ decay as caused by effective 4-fermion operators with 1 b-quark, 1 s-quark, and 2 muons: 
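In a common parametrization (a sketch, with normalization conventions varying between papers; I keep only the bilinears that can mediate the decay of a pseudoscalar meson):

$$\mathcal{L}_{\rm eff} \supset \frac{(\bar b_L \gamma^\rho s_L)(\bar\mu \gamma_\rho \gamma_5 \mu)}{M_L^2} + \frac{(\bar b_R \gamma^\rho s_R)(\bar\mu \gamma_\rho \gamma_5 \mu)}{M_R^2} + \frac{(\bar b \gamma_5 s)(\bar\mu \mu)}{M_S^2} + \frac{(\bar b \gamma_5 s)(\bar\mu \gamma_5 \mu)}{M_P^2} + {\rm h.c.}$$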

Naively, integrating out a mediator with mass M generates a 4-fermion operator suppressed by M^2. In the standard model, only the first operator is generated, with ML,SM ≈ 17 TeV, dominantly by the diagram with Z-boson exchange pictured here. That scale is much bigger than the Z mass because the diagram is suppressed by a 1-loop factor, and furthermore it is proportional to the CKM matrix element V_ts, whose value is about 0.04. The remaining operators do not arise in the SM; in particular, there are no scalars that could generate the MS or MP operators (the Higgs boson couples proportionally to mass, thus by construction it has no flavor-violating couplings to quarks). 
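To see parametrically where the 17 TeV comes from, here's a back-of-the-envelope estimate (not the full matching computation): the 1-loop and CKM suppressions of the coefficient push the effective scale well above M_W,

$$\frac{1}{M_{L,\rm SM}^2} \sim \frac{g^2}{16\pi^2}\,|V_{ts}|\,\frac{g^2}{M_W^2} \quad\Rightarrow\quad M_{L,\rm SM} \sim \frac{4\pi M_W}{g^2\sqrt{|V_{ts}|}} \approx 12~{\rm TeV},$$

in the right ballpark of the quoted 17 TeV once the order-one factors are kept.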

In terms of the coefficients of these operators, the Bs→μμ branching fraction relative to the SM one is given by
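Schematically it reads as follows (a sketch only: I drop order-one factors, and the signs are convention dependent; the key feature is the chirality-flip enhancement of the scalar and pseudoscalar terms):

$$\frac{{\rm BR}(B_s\to\mu\mu)}{{\rm BR}_{\rm SM}} \sim \left|1 + \frac{M_{L,\rm SM}^2}{M_L^2} - \frac{M_{L,\rm SM}^2}{M_R^2} + \frac{m_{B_s}^2}{2 m_\mu m_b}\frac{M_{L,\rm SM}^2}{M_P^2}\right|^2 + \left|\frac{m_{B_s}^2}{2 m_\mu m_b}\frac{M_{L,\rm SM}^2}{M_S^2}\right|^2$$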

LHCb says that this ratio should not be larger than 2 or smaller than 1/3. This leads to model-independent constraints on the mass scales suppressing the 4-fermion operators. And so, the lower bound on ML and MR is about 30 TeV, which is comparable to the standard model contribution. The bound on the scalar and pseudoscalar operators is much stronger: MS,MP ≳ 150,200 TeV. \begin{digression} The reason is that the contribution of the vector operators to the Bs→μμ decay is suppressed by the small ratio of the muon and Bs masses, which goes under the name of helicity suppression. The Bs has spin zero, and a vector particle mediating the decay always couples to 2 muons of the same chirality. In the limit mμ=0, when chirality equals helicity, the muon spins add up, which forbids the decay by angular momentum conservation. \end{digression}
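In equations, the helicity suppression amounts to the following parametric scaling of the decay amplitudes (again a sketch, order-one factors dropped):

$$\mathcal{A}_{L,R} \propto \frac{f_{B_s}\, m_\mu}{M_{L,R}^2}, \qquad \mathcal{A}_{S,P} \propto \frac{f_{B_s}\, m_{B_s}^2/m_b}{M_{S,P}^2},$$

so for the same experimental limit the bounds on the scalar operators are stronger by roughly $\sqrt{m_{B_s}^2/2 m_\mu m_b} \approx 6$, which is how the 30 TeV vector bound gets promoted to the 150-200 TeV range.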

Consequently, the LHCb result can be interpreted as a constraint on new physics capable of generating the 4-fermion operators listed above. For example, a generic pseudoscalar with order-1, flavor-violating couplings to quarks and leptons must be heavier than about 100 TeV. It may sound surprising that the LHC can probe physics above 100 TeV, even if indirectly. But this is in fact typical for B-physics: observables related to CP violation and mixing of B-mesons are sensitive to similar energy scales (see e.g. Table I of this paper). Notice however that 100 TeV is not a hard bound on new pseudoscalars. If the new physics has a built-in mechanism suppressing the flavor-violating couplings, then even weak-scale masses may be allowed.

Now, what happens in SUSY? SUSY always comes in a package with an extended Higgs sector, and the exchange of the heavier cousins of the Higgs boson can generate the scalar and pseudoscalar operators above. However, the bounds on the heavy Higgs masses from Bs→μμ will always be much weaker than the 100 TeV quoted above. Firstly, the Higgses couple proportionally to mass, thus the Yukawa couplings relevant for this decay are much smaller than one. Secondly, the Higgses have flavor-conserving couplings at tree level, and flavor violation is generated only at 1 loop. Finally, models of low-energy SUSY always assume some mechanism to suppress flavor violation (otherwise all hell breaks loose); in typical realizations flavor-violating amplitudes are suppressed by the CKM matrix elements, much as in the standard model. All in all, SUSY appears less interesting in this context than other new physics models, and SUSY contributions to Bs→μμ are typically smaller than the standard model ones.


But then SUSY has many knobs and buttons. The one called tanβ -- the ratio of the vacuum values of the two Higgs fields -- is useful here because the Yukawa couplings of the heavy Higgses to down-type quarks and leptons happen to be proportional to tanβ. Some SUSY contributions to the branching fraction are proportional to the 6th power of tanβ (see the parametric scaling below). It is then possible to pump up tanβ such that the SUSY contribution to Bs→μμ exceeds the standard model one and becomes observable. For this reason, Bs→μμ was hailed as a probe of SUSY. But, at the end of the day, the bound from Bs→μμ on the heavy Higgs masses is relevant only in a specific corner of the parameter space (large tanβ), and even then the SUSY contribution crucially depends on other tunable parameters: the Higgsino and gaugino masses, mass splittings in the squark sector, the size of the A-terms, etc. This is illustrated by the plot on the right, where the bounds (red) change significantly for different assumptions about the μ-term and the sign of the A-term. Thus, the bound may be an issue in some (artificially) constrained SUSY scenarios like mSUGRA, but it can be easily dodged in the more general case.
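For the record, the parametric behavior at large tanβ is schematically (a sketch; the loop factor depends on the Higgsino and gaugino masses and the squark spectrum):

$${\rm BR}(B_s\to\mu\mu)_{\rm SUSY} \propto \frac{m_b^2\, m_\mu^2 \tan^6\beta}{M_A^4} \times ({\rm loop~factor})^2,$$

with MA the heavy Higgs mass scale: the amplitude picks up one power of tanβ from each of the bottom and muon Yukawas and one from the loop-induced flavor-violating coupling, and the rate squares that to tan⁶β.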

To conclude, you should interpret the LHCb measurement of the Bs→μμ branching fraction as a strong bound on theories of new physics coupled to leptons and, in a flavor-violating way, to quarks. In the context of SUSY, however, there are far better reasons to worry about its health (flavor and CP, the little hierarchy problem, direct searches). So one should not view Bs→μμ as the SUSY killer, but as just another handful of earth upon the coffin ;-)

Some pictures borrowed from Mathieu Perrin-Terrin's talk

Wednesday, 14 November 2012

Higgs: what's new

I know, there's already a dozen nice summaries on blogs (for example here, here, and here), so why do you need another one? Anyway... the new release of LHC Higgs results is the highlight of this year's HCP conference (HCP stands for Hadron Collider Physics). The game is completely different than a few months ago: there's no doubt that a 126 GeV Higgs-like particle is there in the data, and nobody gives a rat's ass whether the signal significance is 5 or 11 sigma. The relevant question now is whether the observed properties of the new particle match those of the standard model Higgs. From that point of view, today's update brought some new developments, all of them depressing.


The money plots from ATLAS and CMS summarize it all:

We're seeing the Higgs in more and more channels, and the observed rates are driven, as if by magic, to  the vertical line denoting the standard model rate. 

It came to a point where the most exciting thing about the new Higgs release was what wasn't there :-) It is difficult not to notice that the cleanest Higgs search channels, h→γγ and the ATLAS h→ZZ→4l, were not updated. In ATLAS, the reason was the discrepancy between the Higgs masses measured in those 2 channels: the best-fit mass came out at 123.5 GeV in the h→ZZ→4l channel and 126.5 GeV in the h→γγ channel. The difference is larger than the estimated mass resolution, therefore ATLAS decided to postpone the update in order to carefully investigate the problem. In CMS, on the other hand, after unblinding the new analysis in the h→γγ channel, the signal strength went down by more than they were comfortable with; in particular, the new results are not very consistent with what was presented on the 4th of July. Most likely, all these analyses will be released before the end of the year, after more cross-checking is done.



Among the things that were there, the biggest news is the h→ττ decay. Last summer there were some hints that the ττ channel might be suppressed, as the CMS exclusion limit was approaching the standard model rate. It seems that the bug in the code has been corrected: CMS, and also ATLAS, now observe an excess of events over the non-Higgs backgrounds consistent with what we expect from the standard model Higgs. The excess is not enough to claim observation of this particular decay, but enough to dampen the hopes that some interesting physics is lurking here. 

Another important update concerns the h→bb decay, for the Higgs produced together with a W or Z boson. Here, in contrast, earlier hints from the Tevatron suggested that the rate might be enhanced by a factor of 2 or so. The LHC experiments are now at the point of surpassing the Tevatron sensitivity in that channel, and they don't see any enhancement: CMS observes a rate slightly above the standard model one (though, again, the excess is not enough to claim observation), while ATLAS sees a large negative fluctuation. Also, the Tevatron has revised the reported signal strength downward, now that they know it should be smaller. So, again, it's "move on folks, nothing to see here"...

What does this all mean for new physics? If one goes beyond the standard model, the Higgs couplings to matter can in principle take arbitrary values, and the LHC measurements can be interpreted as constraints on these couplings. As it is difficult to plot a multi-dimensional parameter space, for presentation purposes one makes simplifying assumptions. One common ansatz is to assume that all tree-level Higgs couplings to gauge bosons get rescaled by a factor cV, and all couplings to fermions get rescaled by an independent factor cf. The standard model corresponds to the point cf=cV=1. Every Higgs measurement selects a preferred region in the cV-cf parameter space, and measurements in different channels constrain different combinations of cV and cf. The plot on the right shows the 1-sigma bands corresponding to individual decay channels, and the 68% CL and 99% CL preferred regions after combining all LHC Higgs measurements. At the end of the day, the standard model agrees well with the data. There is, however, a lower χ² minimum in the region of the parameter space where the relative sign between the Higgs couplings to gauge bosons and to fermions is flipped. The sign does not matter for most of the measurements, except in the h→γγ channel. The reason is that h→γγ is dominated by two 1-loop processes, one with the W boson and one with the top quark in the loop. Flipping the sign changes the interference between these two processes from destructive to constructive, the latter leading to an enhancement of the h→γγ rate, in agreement with observations. On the downside, I'm not aware of any model where the flipped sign would come out naturally (and anyway the h→γγ rate will likely go down after the CMS update, probably erasing the preference for the non-SM minimum).
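To put a number on the sign flip, here is a rough sketch using approximate values of the 1-loop amplitudes for mh ≈ 126 GeV:

$$\Gamma(h\to\gamma\gamma) \propto \left|c_V A_W + c_f A_t\right|^2, \qquad A_W \approx -8.3, \quad A_t \approx +1.8,$$

so going from cf = +1 to cf = -1 (at cV = 1) turns |-8.3+1.8|² into |-8.3-1.8|², an enhancement of the rate by a factor of roughly 2.4.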

Finally,  we learned at the HCP that the LHC is taking precision Higgs measurements to a new level, probing not only the production rates but also more intricate properties of the Higgs signal. In particular,  CMS presented an analysis of the data in the h→ZZ→4l channel that discriminates between a scalar and a pseudoscalar particle. What this really means is that they discriminate between 2 operators allowing a decay of  the Higgs into Z bosons: 
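Schematically, and up to normalization, the two operators are

$$\mathcal{O}_{0^+} = h\, Z_\mu Z^\mu, \qquad \mathcal{O}_{0^-} = h\, Z_{\mu\nu}\tilde Z^{\mu\nu}, \qquad \tilde Z^{\mu\nu} \equiv \tfrac{1}{2}\epsilon^{\mu\nu\rho\sigma} Z_{\rho\sigma}.$$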

The first operator occurs in the standard model at tree level, and leads to a preference for decays into longitudinally polarized Z bosons. The other is the lowest-order coupling possible for a pseudoscalar, and leads to decays into transversely polarized Z bosons only. By looking at the angular distributions of the leptons from the Z decays (a transverse Z prefers to emit leptons along its direction of motion, while a longitudinal Z prefers to emit them perpendicular to it), one can determine the relative amount of transverse and longitudinal Z bosons in the Higgs sample, and thus discriminate between the two operators. CMS observes a 2.5 sigma preference for the standard model operator, which is of course not surprising (it'd be hard to understand why the h→ZZ rate is so close to the standard model one if the other operator were responsible for the decay). With more data we will obtain more meaningful constraints on the higher-dimensional couplings of the Higgs.
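For reference, the angular distributions behind this statement (θ* is the lepton angle in the Z rest frame measured with respect to the Z direction of motion, and lepton masses are neglected):

$$\frac{d\Gamma_{\rm long}}{d\cos\theta^*} \propto \sin^2\theta^*, \qquad \frac{d\Gamma_{\rm transv}}{d\cos\theta^*} \propto 1 + \cos^2\theta^*.$$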

To summarize, many particle theorists had placed their bets on Higgs physics as the most likely place where new physics may show up. Unfortunately, the simplest and most boring version of the Higgs predicted by the standard model is emerging from the LHC data. It may be the right time to start scanning job ads in condensed matter or neuroscience ;-)

All Higgs parallel session talks are here (the password is given in the dialog box).

Friday, 9 November 2012

Fermi on the Fermi line

The 130 GeV monochromatic gamma-ray emission from the galactic center detected by the Fermi satellite may be a signal of dark matter. Until last week the claim was based on freelance analyses by theorists using publicly available Fermi data. At the Fermi symposium last week the collaboration made its first public statement on the tentative line signal. Obviously, a word from the collaboration carries more weight, as they know better the nuts and bolts of their detector. Besides, the latest analysis from Fermi uses reprocessed data with a corrected energy scale and fancier fitting algorithms, which in principle should give them better sensitivity to the signal. The outcome is that you can see the glass as half-full or half-empty. On one hand, Fermi confirms the presence of a narrow bump in the gamma-ray spectrum near 130 GeV. On the other hand, certain aspects of the data cast doubt on the dark matter origin of the bump. Here are the most important things that have been said.

  •  Recall that Fermi's previous line search, in 2 years of data, didn't report any signal. Actually, neither does the new 4-year one, if Fermi's a-priori optimized search regions are used. In particular, the significance of the bump near 130 GeV in the 12x10 degree box around the galactic center is merely 2.2 sigma. There is no outright contradiction with the theorists' analyses, as the latter employ different, more complicated search regions. In fact, if Fermi instead focuses on a smaller 4x4 degree box around the galactic center, they see a signal with 3.35 sigma local significance in the reprocessed data (without reprocessing, the significance is 4 sigma). This is the first time the Fermi collaboration admits seeing a feature that could possibly be a signal of dark matter annihilation.
  • Another piece of news is that the 130 GeV line has been upgraded to a 135 GeV line: it turns out that reprocessing the data shifted the position of the bump. That should make little difference to dark matter models explaining the line signal, but in any case you should expect another round of theory papers fitting the new number ;-)
  • Unfortunately, Fermi also confirms the presence of a 3 sigma line near 130 GeV in the Earth limb data (where there should be none). Fermi assigns the effect to a 30% dip in detection efficiency in the bins above and below 130 GeV. This dip cannot by itself explain the 135 GeV signal from the galactic center. However, it may be that the line is an unlucky fluctuation on top of the instrumental effect due to the dip.  
  • Fermi points out a few other details that may be worrisome. They say there's some indication that the 135 GeV feature does not look quite as one would expect if it were due to dark matter. They find bumps of similar significance at other energies and other places in the sky. Also, the significance of the 135 GeV signal drops when reprocessed data and more advanced line-fitting techniques are used, while one would expect the opposite if the signal is of physical origin.
  • A fun fact for dessert. The strongest line signal that Fermi finds is near 5 GeV and has 3.7 sigma local significance (below 3 sigma once the look-elsewhere effect is taken into account; see the toy estimate below the list). 5 GeV dark matter could fit the DAMA and CoGeNT direct detection signals, if you ignore the limits from the Xenon and CDMS experiments. Will the 5 GeV line prove as popular with theorists as the 130 GeV one?
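As a toy estimate of how the look-elsewhere effect deflates such local significances (a sketch: the number of trials below is made up for illustration, and Fermi's actual correction is more involved):

```python
# Toy look-elsewhere correction: if a line could have popped up in any of
# N independent energy bins, the global p-value is 1 - (1 - p_local)^N.
from scipy.stats import norm

local_sigma = 3.7   # local significance of the 5 GeV feature
n_trials = 30       # hypothetical number of independent bins (made up)

p_local = norm.sf(local_sigma)                 # one-sided local p-value
p_global = 1.0 - (1.0 - p_local) ** n_trials   # trials-corrected p-value
global_sigma = norm.isf(p_global)              # back to units of sigma

print(f"{local_sigma} sigma local -> {global_sigma:.1f} sigma global")
```

With these made-up numbers the 3.7 sigma local bump indeed lands below 3 sigma globally, consistent with what Fermi quotes.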


So, the line is sitting there in the data, and the potential consequences are mind-blowing. However, after the symposium there are more reasons to be skeptical about the dark matter interpretation. More data and more work from Fermi should clarify the situation. There's also a chance that the HESS telescope (an Earth-based gamma-ray observatory) will confirm or refute the signal sometime next spring.