Saturday, 18 June 2016

Game of Thrones: 750 GeV edition

The 750 GeV diphoton resonance has made a big impact on theoretical particle physics. The number of papers on the topic is already legendary, and they keep coming at a rate of order 10 per week. Given that the Backović model of ambulance chasing is falsified, there's no longer a theoretical upper limit. Does this mean we are not dealing with the classic ambulance-chasing scenario? The answer may be known in the coming days.

So who's winning this race?  What kind of question is that, you may shout: of course it's Strumia! And you would be wrong, independently of the metric. The contest is much fiercer than one might expect: it takes 8 papers on the topic to win, and 7 papers to even get on the podium.  Among the 3 authors with 7 papers, the final classification is decided not by trial by combat but by the citation count.  The result is (drum roll):

Citations, ah well...   The social dynamics of our community encourage referencing all previous work on the topic, rather than just the relevant papers, which in this particular case has triggered a period of inflation. One day soon citation numbers will mean as much as authorship does in experimental particle physics. But for now the size of the h-index is still an important measure of virility for theorists. If the citation count rather than the number of papers is the main criterion, the iron throne is taken by a Targaryen contender (trumpets):

This explains why the resonance is usually denoted by the letter S.

Congratulations to all the winners.  To all the rest, I wish more luck and persistence in the next edition, provided it takes place.

My scripts are not perfect (in previous versions I missed crucial contenders, as pointed out in the comments), so let me know in case I missed your papers or miscalculated citations. 
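For the curious, here is a minimal sketch of what such a counting script could look like, based on the public arXiv API. The query string is only illustrative, and citation counts would have to come from elsewhere (e.g. INSPIRE), so this toy version handles only the paper count:

```python
# Toy ranking of 750 GeV paper counts per author via the arXiv API.
# The search query is illustrative; citations are not fetched here.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET
from collections import Counter

ATOM = "{http://www.w3.org/2005/Atom}"

def author_paper_counts(query='abs:"750 GeV" AND cat:hep-ph', max_results=1000):
    """Return a Counter mapping author name -> number of matching papers."""
    url = ("http://export.arxiv.org/api/query?" +
           urllib.parse.urlencode({"search_query": query,
                                   "start": 0,
                                   "max_results": max_results}))
    with urllib.request.urlopen(url) as response:
        feed = ET.parse(response).getroot()
    counts = Counter()
    for entry in feed.iter(ATOM + "entry"):
        for author in entry.iter(ATOM + "author"):
            counts[author.find(ATOM + "name").text] += 1
    return counts

if __name__ == "__main__":
    for name, n in author_paper_counts().most_common(10):
        print(f"{n:3d}  {name}")
```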

Friday, 10 June 2016

Black hole dark matter

The idea that dark matter is made of primordial black holes is very old but has always been in the backwater of particle physics. The WIMP and asymmetric dark matter paradigms are preferred for several reasons, such as calculability, observational opportunities, and a more direct connection to cherished theories beyond the Standard Model. But in recent months there has been renewed interest, triggered in part by the LIGO observations of black hole binary mergers. In the first observed event, the mass of each of the black holes was estimated at around 30 solar masses. While such a system may well be of boring astrophysical origin, it is somewhat unexpected, because the typical black holes we come across in everyday life are either a bit smaller (around one solar mass) or much larger (the supermassive black hole in the galactic center). On the other hand, if the dark matter halo were made of black holes, scattering processes would sometimes create short-lived binary systems. Assuming a significant fraction of dark matter in the universe is made of primordial black holes, this paper estimated that the rate of merger processes is in the right ballpark to explain the LIGO events.

Primordial black holes can form from large density fluctuations in the early universe. On the largest observable scales the universe is incredibly homogeneous, as witnessed by the uniform temperature of the Cosmic Microwave Background over the entire sky. However, on smaller scales the primordial inhomogeneities could be much larger without contradicting observations.  From the fundamental point of view, large density fluctuations may be generated by several distinct mechanisms, for example during the waterfall phase at the final stage of the hybrid inflation scenario. While it is rather generic that this or a similar process may seed black hole formation in the radiation-dominated era, severe fine-tuning is required to produce the right amount of black holes and to ensure that the resulting universe resembles the one we know.
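As a quick sanity check on the numbers, a black hole formed in the radiation-dominated era inherits roughly the mass contained within the horizon at the formation time, M ~ c^3 t / G. A back-of-the-envelope script (order-one prefactors dropped, so treat the output as indicative only):

```python
# Horizon mass M ~ c^3 t / G at cosmic time t, in solar masses.
# Order-one prefactors are dropped; this is an order-of-magnitude sketch.
G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30  # SI units

def horizon_mass_msun(t_seconds):
    return c**3 * t_seconds / G / M_sun

for t in (1e-6, 1e-4, 1.0):
    print(f"t = {t:.0e} s  ->  M ~ {horizon_mass_msun(t):.2g} M_sun")
# A ~30 solar-mass black hole forms around t ~ 1e-4 s, near the QCD epoch.
```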

All in all, it's fair to say that the scenario where all or a significant fraction of dark matter is made of primordial black holes is not completely absurd. Moreover, one typically expects the masses to span a fairly narrow range. Could it be that the LIGO events are the first indirect detection of dark matter made of O(10)-solar-mass black holes? One problem with this scenario is that it is excluded, as can be seen in the plot.  Black holes sloshing through the early dense universe accrete the surrounding matter and produce X-rays, which could ionize atoms and disrupt the Cosmic Microwave Background. In the 10-100 solar mass range relevant for LIGO this effect currently gives the strongest constraint on primordial black holes: according to this paper they are allowed to constitute no more than 0.01% of the total dark matter abundance. In astrophysics, however, not only signals but also constraints should be taken with a grain of salt.  In this particular case, the word in town is that the derivation contains a numerical error and that the corrected limit is 2 orders of magnitude less severe than what's shown in the plot. Moreover, this limit strongly depends on the model of accretion, and more favorable assumptions may buy another order of magnitude or two. All in all, the possibility of dark matter made of primordial black holes in the 10-100 solar mass range should not be completely discarded yet. Another possibility is that black holes make up only a small fraction of dark matter, but the merger rate is higher, closer to the estimate of this paper.

Assuming this is the true scenario, how will we know? Direct detection of black holes is discouraged, and the usual cosmic-ray signals are absent. Instead, in most of the mass range, the best probes of primordial black holes are various lensing observations. For LIGO-sized black holes, progress may be made via observations of fast radio bursts. These are strong radio signals of (probably) extragalactic origin and millisecond duration. A radio signal passing near an O(10)-solar-mass black hole could be strongly lensed, leading to repeated signals detected on Earth with an observable time delay. In the near future we should either observe hundreds of such repeated bursts, or obtain new strong constraints on primordial black holes in the interesting mass ballpark. Gravitational wave astronomy may offer another way.  When more statistics are accumulated, we will be able to say something about the spatial distribution of the merger events. Primordial black holes should be distributed like dark matter halos, whereas astrophysical black holes should be correlated with luminous galaxies. Also, the typical eccentricity of astrophysical black hole binaries should be different.  With some luck, the primordial black hole dark matter scenario may be vindicated or robustly excluded in the near future.
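To get a feel for the time scales involved: for a point-mass lens, the delay between the two images is set by the scale 4GM/c^3, up to an order-one function of the impact parameter and a redshift factor (both ignored in this rough sketch):

```python
# Characteristic lensing time delay dt ~ 4 G M / c^3 for a point-mass lens.
# The O(1) dependence on impact parameter and the (1+z) factor are ignored.
G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30  # SI units

def delay_ms(mass_msun):
    return 4 * G * mass_msun * M_sun / c**3 * 1e3  # in milliseconds

for m in (1, 10, 30, 100):
    print(f"M = {m:4d} M_sun  ->  dt ~ {delay_ms(m):.2f} ms")
# For O(10)-solar-mass lenses the delay is a fraction of a millisecond,
# comparable to the burst duration itself, hence the repeated signals.
```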

See also these slides for more details. 

Friday, 27 May 2016

CMS: Higgs to mu tau is going away

One interesting anomaly in LHC run-1 was a hint of Higgs boson decays to a muon and a tau lepton. Such a process is forbidden in the Standard Model by the conservation of muon and tau lepton numbers. Neutrino masses violate the individual lepton numbers, but their effect is far too small to affect the Higgs decays in practice. On the other hand, new particles do not have to respect the global symmetries of the Standard Model, and they could induce lepton-flavor-violating Higgs decays at an observable level. Surprisingly, CMS found a small excess in the Higgs to tau mu search in their 8 TeV data, with the measured branching fraction Br(h→τμ) = (0.84±0.37)%.  The analogous measurement in ATLAS is 1 sigma above the background-only hypothesis, Br(h→τμ) = (0.53±0.51)%. Together these merely correspond to a 2.5 sigma excess, so it's not too exciting in itself. However, taken together with the B-meson anomalies in LHCb, it raised hopes for lepton-flavor-violating new physics just around the corner.  For this reason, the CMS excess inspired a few dozen theory papers, with Z' bosons, leptoquarks, and additional Higgs doublets pointed out as possible culprits.

Alas, the wind is changing. CMS made a search for h→τμ in their small stash of 13 TeV data collected in 2015. This time they were hit by a negative background fluctuation, and they found Br(h→τμ)=(-0.76±0.81)%. The accuracy of the new measurement is worse than that in run-1, but nevertheless it lowers the combined significance of the excess below 2 sigma. Statistically speaking, the situation hasn't changed much,  but psychologically this is very discouraging. A true signal is expected to grow when more data is added, and when it's the other way around it's usually a sign that we are dealing with a statistical fluctuation...
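A naive Gaussian (inverse-variance) combination of the three numbers quoted above roughly reproduces these significances; the collaborations of course use full likelihoods, so the sketch below is only illustrative:

```python
# Naive inverse-variance combination of the Br(h -> tau mu) measurements.
import math

measurements = {
    "CMS 8 TeV":   (0.84, 0.37),   # central value, error, in %
    "ATLAS 8 TeV": (0.53, 0.51),
    "CMS 13 TeV":  (-0.76, 0.81),
}

def combine(data):
    weights = {k: 1.0 / err**2 for k, (_, err) in data.items()}
    total = sum(weights.values())
    mean = sum(weights[k] * data[k][0] for k in data) / total
    return mean, 1.0 / math.sqrt(total)

run1 = {k: v for k, v in measurements.items() if "8 TeV" in k}
for label, data in (("run-1 only", run1), ("all", measurements)):
    mean, err = combine(data)
    print(f"{label}: Br = ({mean:.2f} +- {err:.2f})%  [{mean/err:.2f} sigma]")
# run-1 only: ~2.5 sigma; adding the 13 TeV data: just below 2 sigma.
```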

So, if you have a cool model explaining the h→τμ  excess be sure to post it on arXiv before more run-2 data is analyzed ;)

Saturday, 14 May 2016

Weekend plot: new limits on neutrino masses

This weekend's plot shows the new limits on neutrino masses from the KamLAND-Zen experiment:

KamLAND-Zen is a group of buddhist monks studying a balloon filled with the xenon isotope Xe136. That isotope has a very long lifetime, of order 10^21 years, and undergoes the lepton-number-conserving double beta decay Xe136 → Ba136 + 2e- + 2νbar. What the monks hope to observe is the lepton-number-violating neutrinoless double beta decay Xe136 → Ba136 + 2e-, which would show up as a peak in the invariant mass distribution of the electron pairs near 2.5 MeV. No such signal has been observed, which sets the limit on the half-life for this decay at T > 1.1*10^26 years.

The neutrinoless decay is predicted to occur if neutrino masses are of Majorana type, and its rate can be characterized by the effective Majorana mass mββ (y-axis in the plot). That parameter is a function of the masses and mixing angles of the neutrinos. In particular, it depends on the mass of the lightest neutrino (x-axis in the plot), which is currently unknown. Neutrino oscillation experiments have precisely measured the mass-squared differences between neutrinos, which are roughly (0.05 eV)^2 and (0.01 eV)^2. But oscillations are not sensitive to the absolute mass scale; in particular, the lightest neutrino may well be massless for all we know.  If the heaviest neutrino has a small electron flavor component, then we expect the mββ parameter to be below 0.01 eV.  This so-called normal hierarchy case is shown as the red region in the plot, and is clearly out of experimental reach at the moment. On the other hand, in the inverted hierarchy scenario (green region in the plot), it is the two heaviest neutrinos that have a significant electron component. In this case, the effective Majorana mass mββ is around 0.05 eV.  Finally, there is also the degenerate scenario (funnel region in the plot) where all 3 neutrinos have very similar masses with small splittings; however, this scenario is now strongly disfavored by cosmological limits on the sum of the neutrino masses (e.g. the Planck limit Σmν < 0.16 eV).
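For reference, the effective Majorana mass is defined as mββ = |Σ_i U_ei^2 m_i|, where U is the neutrino mixing matrix. A short scan over the unknown Majorana phases reproduces the quoted ballpark for the inverted hierarchy (the oscillation parameters below are approximate global-fit values, not necessarily those used in the plot):

```python
# m_bb = |sum_i U_ei^2 m_i| in the inverted hierarchy, scanning the
# unknown Majorana phases. Oscillation inputs are approximate values.
import numpy as np

s12sq, s13sq = 0.31, 0.022          # sin^2(theta12), sin^2(theta13)
dm2_sol, dm2_atm = 7.5e-5, 2.5e-3   # mass-squared differences in eV^2
c12sq, c13sq = 1 - s12sq, 1 - s13sq

def mbb_range_inverted(m_lightest=0.0):
    m3 = m_lightest                          # lightest state in IH
    m1 = np.sqrt(m3**2 + dm2_atm)            # approximate IH spectrum
    m2 = np.sqrt(m3**2 + dm2_atm + dm2_sol)
    phases = np.linspace(0, 2 * np.pi, 201)
    a, b = np.meshgrid(phases, phases)
    mbb = np.abs(c12sq * c13sq * m1
                 + s12sq * c13sq * m2 * np.exp(1j * a)
                 + s13sq * m3 * np.exp(1j * b))
    return mbb.min(), mbb.max()

lo, hi = mbb_range_inverted()
print(f"IH with massless lightest neutrino: {lo:.3f} eV < m_bb < {hi:.3f} eV")
# -> roughly 0.02-0.05 eV: the green band in the plot.
```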

As can be seen in the plot, the results from KamLAND-Zen, when translated into limits on the effective Majorana mass, almost touch the inverted hierarchy region. The strength of this limit depends on some poorly known nuclear matrix elements (hence the width of the blue band). But even in the least favorable scenario, future, more sensitive experiments should be able to probe that region. Thus, there is hope that within the next few years we may prove the Majorana nature of neutrinos, or at least disfavor the inverted hierarchy scenario.

Monday, 9 May 2016

Off we go

The LHC has been back in action since last weekend, again colliding protons at 13 TeV energy. The weasels' conspiracy was foiled, and the perpetrators were exemplarily electrocuted. PhD students have been deployed around the LHC perimeter to counter any further sabotage attempts (stoats are known to have been in league with weasels in the past). The period that begins now may prove to be the most exciting time for particle physics in this century.  Or the most disappointing.

The beam intensity is still a factor of 10 below the nominal one, so the harvest of last weekend is a meager 40 inverse picobarns. But the number of proton bunches in the beam is quickly increasing, and once it reaches O(2000), the data will stream in at a rate of an inverse femtobarn per week or more. For the near future, the plan is to have a few inverse femtobarns on tape by mid-July, which would roughly double the current 13 TeV dataset. The first analyses of this chunk of data should be presented around the time of the ICHEP conference in early August. At that point we will know whether the 750 GeV particle is real. Celebrations will begin if the significance of the diphoton peak increases after adding the new data, even if the statistics are not enough to officially announce a discovery. In the best of all worlds, we may also get a hint of a matching 750 GeV peak in another decay channel (ZZ, Z-photon, dilepton, t-tbar, ...), which would help focus our model building. On the other hand, if the significance of the diphoton peak drops in August, there will be a massive hangover...
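For a back-of-the-envelope feel for these numbers: at the nominal instantaneous luminosity of about 10^34 cm^-2 s^-1 (an assumed reference value), the weekly harvest works out as follows:

```python
# Integrated luminosity per week from an assumed instantaneous luminosity.
# The duty cycle (fraction of calendar time in stable beams) is a guess.
L_inst = 1e34          # cm^-2 s^-1, nominal LHC luminosity (assumed)
CM2_PER_INV_FB = 1e39  # 1 fb^-1 corresponds to 1e39 cm^-2
week = 7 * 86400       # seconds

for duty in (0.3, 0.5):
    weekly = L_inst * week * duty / CM2_PER_INV_FB
    print(f"duty cycle {duty:.0%}:  ~{weekly:.1f} fb^-1 per week")
# -> roughly 2-3 fb^-1 per week once the machine runs at full intensity.
```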

By the end of October, when the 2016 proton collisions are scheduled to end, the LHC hopes to collect some 20 inverse femtobarns of data. This should already give us a rough feeling for what new physics is within the reach of the LHC. If a hint of another resonance is seen at that point, one will surely be able to confirm or refute it with the data collected in the following years.  If nothing is seen... then you should start telling yourself that condensed matter physics is also sort of fundamental, or that systematic uncertainties in astrophysics are not so bad after all...  In any scenario, by December, when the first analyses of the full 2016 dataset are released, we will know infinitely more than we do today.

So fasten your seat belts and get ready for a (hopefully) bumpy ride. Serious rumors should start showing up on blogs and Twitter starting in July.

Friday, 1 April 2016

April Fools' 16: Was LIGO a hack?


This post is an April Fools' joke. LIGO's gravitational waves are for real. At least I hope so ;) 

We have recently had a few scientific embarrassments, where a big discovery announced with great fanfare was subsequently overturned by new evidence.  We still remember OPERA's faster-than-light neutrinos, which turned out to be a loose cable, or BICEP2's gravitational waves from inflation, which turned out to be galactic dust emission... It seems that another such embarrassment is coming our way: the recent LIGO discovery of gravitational waves emitted in a black hole merger may share a similar fate. There are reasons to believe that the experiment was hacked, and that the signal was injected by a prankster.

From the beginning, one reason to be skeptical about LIGO's discovery was that the signal seemed too beautiful to be true. Indeed, the experimental curve looked as if it were taken out of a textbook on general relativity, with a clearly visible chirp signal from the inspiral phase, followed by a ringdown signal as the merged black hole relaxes to the Kerr state. The reason may be that it *is* taken out of a textbook. This, at least, is what is strongly suggested by recent developments.

On EvilZone, a well-known hackers' forum, a hacker using the nickname Madhatter boasted that it was possible to tamper with scientific instruments, including the LHC, the Fermi satellite, and the LIGO interferometer.  When challenged, he or she uploaded a piece of code that allows one to access LIGO computers. Apparently, the hacker took advantage of the same backdoor that allows selected members of the LIGO team to inject a fake signal in order to test the analysis chain.  This was brought to the attention of the collaboration members, who decided to test the code. To everyone's bewilderment, the effect was to reproduce exactly the same signal in the LIGO apparatus as the one observed in September last year!

Even though no traces of a hack have been discovered, there is now little doubt that foul play was involved. It is not clear what the hacker's motive was: was it just a prank, or an elaborate plan to discredit the scientists? What is even more worrying is that the same thing could happen in other experiments. The rumor is that the ATLAS and CMS collaborations are already checking whether the 750 GeV diphoton resonance signal could also have been injected by a hacker.

Thursday, 17 March 2016

Diphoton update

Today at the Moriond conference ATLAS and CMS updated their diphoton resonance searches. There had been a rumor of an ATLAS analysis with looser cuts on the photons where the significance of the 750 GeV excess grows to a whopping 4.7 sigma. The rumor had it that this analysis would be made public today, so expectations were high. However, the loose-cuts analysis was not approved in time by the collaboration, and the fireworks display was cancelled.  In any case, there was some good news today, and some useful info for model builders was provided.

Let's start with ATLAS. For the 13 TeV results, they now have two analyses: one called spin-0 and one called spin-2. Naively, the cuts in the latter are optimized not for a spin-2 resonance but rather for a high-mass resonance (where there's currently no significant excess), so the spin-2 label should not be taken too seriously in this case. Both analyses show a similar excess at 750 GeV: 3.9 and 3.6 sigma, respectively, for a wide resonance. Moreover, ATLAS provides additional information about the diphoton events, such as the angular distribution of the photons, the number of accompanying jets, the amount of missing energy, etc. This may be very useful for theorists entertaining less trivial models, for example ones where the 750 GeV resonance is produced in the decay of a heavier parent particle. Finally, ATLAS shows a re-analysis of the diphoton events collected at 8 TeV center-of-mass energy. The original run-1 analysis was a bit sloppy in the interesting mass range; for example, no limits at all were given for the 750 GeV scalar hypothesis.  Now the run-1 data have been cleaned up and analyzed using the same methods as in run-2. Excitingly, there's a 2 sigma excess in the run-1 spin-0 analysis, roughly compatible with what one would expect given the observed run-2 excess!   No significant excess is seen in the spin-2 analysis, and the tension between the run-1 and run-2 data is quite severe in this case. Unfortunately, ATLAS does not quote the combined significance or the best-fit cross section for the 750 GeV resonance.
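Since no official number is given, one can attempt a naive estimate: treating the run-1 and run-2 spin-0 results as independent Gaussian measurements of the same signal and combining the z-scores with equal weights (Stouffer's method). This ignores the very different sensitivities of the two datasets, so take it as a rough upper-end guess:

```python
# Naive equal-weight combination of independent z-scores (Stouffer's method).
# Equal weights overstate the contribution of the less sensitive run-1 data.
import math

def stouffer(z_scores):
    return sum(z_scores) / math.sqrt(len(z_scores))

z_run2, z_run1 = 3.9, 2.0  # ATLAS spin-0 significances quoted above
print(f"naive combined significance: {stouffer([z_run2, z_run1]):.1f} sigma")
# -> about 4.2 sigma
```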

For CMS, the big news is that the amount of 13 TeV data at their disposal has increased by 20%. Using MacGyver skills, they managed to make sense of the chunk of data collected when the CMS magnet was off due to a technical problem. Apparently it was worth it, as new diphoton events have been found in the 750 GeV ballpark. Thanks to that, and to a better calibration, the significance of the run-2 diphoton excess actually increases to 2.9 sigma!  Furthermore, much like ATLAS, CMS updated their run-1 diphoton analyses and combined them with the run-2 ones.  Again, the combination increases the significance of the 750 GeV excess. The combined significance quoted by CMS is 3.4 sigma, similar for the spin-0 and spin-2 analyses. Unlike in ATLAS, the best fit is for a narrow resonance, which is the preferred option from the theoretical point of view.

In summary, the diphoton excess survived the first test.  After adding more data and improving the analysis techniques the significance slightly increases rather than decreases, as expected for a real particle.  The signal is now a bit more solid: both experiments have a similar amount of diphoton data and they both claim a similar significance of the  750 GeV bump.  It may be a good moment to rename the ATLAS diphoton excess as the LHC diphoton excess :)  So far, the story of 2012 is repeating itself: the initial hints of a new resonance solidify into a consistent picture. Are we going to have another huge discovery this summer?