Last week CERN enjoyed a visit from James Hartle, the Hartle in the Hartle-Hawking wavefunction. James gave a theory seminar about which I will share some remarks another time. He also gave a general-audience colloquium entitled The Future of Gravity. This one I'm reporting only in passing, as there is not much to report on. The colloquium was a piece about past and future tests of Einstein's general relativity.

It was a standard Google talk. It seems as if James googled for "test", "Einstein", "gravity". The results returned by the search engine spanned the whole spectrum of experimental efforts, from laboratory and solar-system studies of weak-field gravity to observations of strong-gravity phenomena in astrophysical environments. One hour of pictures & names and little information. Not that it was a total waste of time. The subject itself is fascinating enough, with the tremendous experimental progress going on. However, one would expect more than just hopping from one experiment to another.

Neither video nor transparencies from the CERN colloquium are available. If you're not discouraged yet, here you can find transparencies from a somewhat similar talk given in Santa Barbara.


## Saturday 21 April 2007

### Caught in a Twister

CERN Cosmo Coffee is a tough venue for speakers. The audience is smart and inquisitive. Discussions are fierce. Not everyone survives...

Last Wednesday, Francesco Sannino was talking about the dark matter abundance predicted by a certain technicolor model of his own. Francesco's research is devoted to understanding technicolor - a mechanism of electroweak symmetry breaking by new strong forces. Recently, he has been advocating a particular model that, according to him, is able to pass the tight experimental constraints. This model also has a dark matter candidate. In a subsequent paper he discussed the amount of dark matter generated during the evolution of the universe. This was the topic of the last Cosmo Coffee.

Things looked quite interesting. In the presence of the new strong forces, new particles with masses around 1 TeV appear, in a similar fashion to how baryons (like the proton and the neutron) arise in QCD. These new particles are called technibaryons. There is also a global U(1) symmetry under which technibaryons are charged. The new global symmetry leads to a new conserved quantum number which, in analogy to the baryon number in QCD, is called the technibaryon number. The lightest technibaryon is stable and can be promoted to a dark matter candidate. Furthermore, the technibaryon number is violated by quantum anomalies; again, just like the baryon number in QCD. When the universe was very hot, the processes changing baryons into technibaryons (and vice versa) were operative. The model thus relates the quantity of dark matter to that of visible matter. Francesco's claim was that the observed ratio of 5 between the dark and visible matter abundances could be naturally obtained.

Alas, the audience didn't buy this claim. Francesco's rather involved computation was questioned. Instead, a simple argument based on the conservation of a certain combination of baryon and technibaryon numbers was presented. This argument suggests that the present-day number densities of baryons and technibaryons predicted by Francesco's model are comparable. Since the technibaryons are about 1000 times heavier than the proton, the predicted dark matter density is too big by a factor of a hundred. Bang, bang, you're dead...
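The audience's objection really is back-of-envelope arithmetic. A quick sketch in Python, using only the round numbers quoted above (not precise values):

```python
# Back-of-envelope version of the audience's argument (round numbers only).
# If baryon and technibaryon number densities today are comparable,
# the ratio of mass densities is just the ratio of masses.
m_proton = 0.938         # proton mass in GeV
m_technibaryon = 1000.0  # ~1 TeV technibaryon mass in GeV

predicted_ratio = m_technibaryon / m_proton  # dark/visible, if n_TB ~ n_B
observed_ratio = 5.0                         # observed dark/visible abundance ratio

overshoot = predicted_ratio / observed_ratio  # a couple of hundred: far too much dark matter
```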

This coming Thursday Francesco will give another talk on a related subject. We will learn if things can be saved. I'll let you know.


## Thursday 19 April 2007

### After MiniBooNE

The MiniBooNE wave swept through the blogosphere a week ago, right after the first results were published by the collaboration. Here at CERN, however, we take our time ;-) Only today, the local neutrino gurus organized a discussion, which I am gladly reporting. The delay is convenient for me, as I don't need to explain the basics. The physics of neutrino oscillations and the kitchen of the MiniBooNE experiment are explained at length here on Cosmic Variance and here on Tommaso's blog. I'm going to concentrate on the most interesting thing, which is theory. More precisely, on the impact of the MiniBooNE results on the theory of neutrino masses.

The MiniBooNE experiment was looking for an appearance of electron neutrinos in a beam of muon neutrinos. The main objective was to confirm or exclude the signal of the earlier experiment called LSND, which had claimed to observe oscillations of muon anti-neutrinos into electron anti-neutrinos. If the LSND signal is real, we need at least one new neutrino species in addition to the Standard Model particles. MiniBooNE did not find any oscillation signal for neutrino energies larger than 475 MeV. This negative result means that the simple two-neutrino-oscillation interpretation of the LSND data is no longer available. As seen in the plot above, MiniBooNE finds some excess of electron neutrinos at smaller energies. However, in this energy range the expected background is large. For the moment, it is not clear if the excess is due to new physics or simply due to an underestimation of the background.
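For orientation, MiniBooNE could test LSND because the two experiments sit at a similar L/E. A sketch of the standard two-flavor oscillation probability; the mixing parameters and beam numbers below are illustrative ballpark figures, not official values:

```python
import math

def appearance_probability(sin2_2theta, dm2_ev2, length_km, energy_gev):
    """Two-flavor oscillation probability:
    P = sin^2(2θ) · sin^2(1.27 · Δm²[eV²] · L[km] / E[GeV])."""
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * length_km / energy_gev) ** 2

# Illustrative LSND-like parameters: Δm² ~ 1 eV², sin²(2θ) ~ 0.003.
p_lsnd = appearance_probability(0.003, 1.0, 0.030, 0.040)       # L ≈ 30 m,  E ≈ 40 MeV
p_miniboone = appearance_probability(0.003, 1.0, 0.540, 0.700)  # L ≈ 540 m, E ≈ 700 MeV
# Similar L/E, so a similar predicted probability - which MiniBooNE did not see.
```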

Thomas Schwetz discussed which theoretical scenarios are able to accommodate the LSND and MiniBooNE data, together with the abundant data from a thousand other neutrino experiments. His message is that it is tough. In fact, it was tough even before MiniBooNE. Explaining the LSND signal requires introducing at least one sterile neutrino (sterile means not coupled to the Z boson, unlike the three active neutrinos of the Standard Model). The problem is that the mixing angle between sterile and active neutrinos required to fit the LSND signal must be rather large. This is at odds with the negative results of experiments like CHOOZ and BUGEY, which searched for neutrino disappearance.

Thomas presented his preliminary analysis of the situation after MiniBooNE. Fits to all neutrino data in 3 active + 1 sterile scenarios become even worse than before. The fits in 3 + 2 scenarios do not change much. If one allows for CP violation, the \chi^2 is around 100 (with about 100 degrees of freedom). According to Thomas, this is discouraging. According to astrophysicists, this is an indication for 3 + 2 scenarios ;-) Strangely, nobody is willing to plunge into 3 + 3 scenarios. Chicken?
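For context, χ² ≈ 100 with ~100 degrees of freedom is, taken at face value, a perfectly acceptable fit. A quick estimate of the p-value using the large-dof normal approximation to the χ² distribution (a sketch, not a substitute for the real global fit):

```python
import math

def chi2_pvalue_large_dof(chi2, dof):
    """Approximate χ² p-value for large dof, using χ² ~ N(dof, 2·dof)."""
    z = (chi2 - dof) / math.sqrt(2.0 * dof)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

p_value = chi2_pvalue_large_dof(100.0, 100)  # ≈ 0.5: formally a fine fit
```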

There are more exotic models proposed to accommodate the LSND results. Thomas mentioned three:

- Sterile neutrino with CPT violation, proposed by Vernon Barger & Co. The LSND signal could be more easily accommodated if neutrinos and anti-neutrinos had different masses. This violates CPT, the hallmark of local relativistic quantum theories. It seems that this scenario is not much affected by MiniBooNE, which studied neutrinos rather than anti-neutrinos.
- Decaying sterile neutrinos, proposed by Thomas Schwetz in person. Here, the LSND signal is a muon anti-neutrino oscillating into a sterile one, the latter decaying to an electron anti-neutrino and an exotic scalar. This model predicts a signal in MiniBooNE which is not there. The model can be saved, however, by introducing another sterile neutrino and CP violation in the decays.
- Shortcuts in extra dimensions, proposed by Heinrich Paes & Co. Here, the sterile neutrino propagates in extra dimensions and travels a different path than the active ones, which changes the oscillation pattern. By shaping the extra dimensions properly, the model can accommodate the MiniBooNE excess at lower energies as well as the suppression of oscillations at higher energies. My impression is that, with a little bit more work, the model could accommodate Harry Potter, too.

If you're interested in my opinion, I think the truth is boring. The LSND signal is fake, while the MiniBooNE excess will go away after re-examining the background. The theory of neutrino oscillations will remain the theory of a 3x3 matrix.

Everything about neutrinos can be found on this page, updated in real time by Alessandro Strumia.

## Wednesday 4 April 2007

### Strongly Interacting Light Higgs

Three weeks ago, Gian Giudice & friends came up with a paper that deserves attention, for two reasons. One, it was conceived in the murky corridors of CERN TH. Two, it deals with an interesting though somewhat neglected subject: the impact of strong dynamics at higher energies on higgs physics at the LHC.

The title is Strongly Interacting Light Higgs. This longish name refers to a scenario where strongly coupled dynamics at a few TeV gives rise to a composite higgs state with a mass of order 100 GeV. The paper does not present us with yet another bright model of electroweak breaking. Rather, the purpose is to put several existing approaches into a wider perspective. The models falling into this class include the composite pseudo-goldstone higgs, little higgs and 5D holographic higgs. The paper proposes a model-independent parametrization and rough estimates of the expected experimental signals.


The Strongly Interacting Light Higgs class has the following features:

- At the weak scale v = 246 GeV, the available particles are those of the Standard Model with one higgs doublet. Electroweak symmetry is broken by the higgs, which acquires a vacuum expectation value. The associated higgs boson is not much heavier than 100 GeV.
- The higgs is a pseudo-goldstone boson, much like the pions in QCD. Like the pions, it comes with a new scale f, which controls the non-linearities of the higgs self-interactions.
- There is yet another scale mR, set by the masses of the resonances associated with the strong dynamics that gave birth to the pseudo-goldstone higgs. mR may be smaller than 4\pi f - the naive cut-off of the low-energy theory.

Gian & others discuss several imprints of strong dynamics at higher scales on the low-energy higgs physics:

- Modified higgs couplings to fermions and gauge bosons. Due to its goldstone nature, the higgs couplings are modified by a factor of order v^2/f^2. Both the total higgs decay rate and the various branching ratios are affected. In the parametrization proposed by the authors, the leading deviations from the Standard Model can be described by just two parameters in the low-energy effective lagrangian.
- Strong WW and WZ scattering. Scattering amplitudes of longitudinally polarized electroweak gauge bosons grow with energy as E^2/v^2. In the Standard Model this dangerous behaviour is tamed by the higgs boson exchange. However, for a pseudo-goldstone higgs, the relevant higgs couplings are modified. As a consequence, the amplitude grows quadratically with energy, even above the higgs mass, until it is tamed by the resonances at the scale mR.
- Strong higgs production. In this scenario, higgs couplings are directly related to those of the longitudinally polarized gauge bosons. Therefore, the amplitude for the higgs pair production, e.g. ZZ->hh, also grows quadratically with energy.

One sad conclusion of the paper is that the sensitivity of the LHC to the discussed effects is not astounding. Deviations of higgs physics from the Standard Model could be measured for v^2/f^2 of order 0.5. The ILC would have much better sensitivity, down to v^2/f^2 of order 0.01. The problem is that finding just the higgs at the LHC would likely postpone the high-energy physics program for many years (a similarly depressive view was recently expressed by Burton Richter). If you prefer to think positive, you may get another message from the paper: stop plotting the mSugra parameter space and think how to measure WW scattering at the LHC.
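To translate those sensitivities into a mass scale: v²/f² fixes the compositeness scale f that can be probed, with v = 246 GeV. A quick sketch, using the 0.5 and 0.01 sensitivities quoted above:

```python
import math

v = 246.0  # electroweak scale in GeV

def compositeness_scale(xi):
    """Compositeness scale f probed at a given sensitivity to ξ = v²/f²."""
    return v / math.sqrt(xi)

f_lhc = compositeness_scale(0.5)   # LHC: ξ ~ 0.5  ->  f ≈ 350 GeV
f_ilc = compositeness_scale(0.01)  # ILC: ξ ~ 0.01 ->  f ≈ 2.5 TeV
```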

## Monday 2 April 2007

### Surprise from Martin Schmaltz

We are done with April Fools hoaxes for this year. It is high time to resume boring theory blogging. I have a few delayed reports in store...

Martin Schmaltz flashed through CERN two weeks ago. Whenever he's around, there is action. The other day, when he talked about Little Higgs at CERN, somebody from the audience called him a little taliban. No wonder, then, that this time nobody at CERN TH gave him a seminar slot. Luckily enough, he was adopted by unsuspecting experimentalists from the EP seminar.

Martin came with a talk entitled MSSM scalar masses and hidden sector interactions. Although the title seems well-behaved, a look at the abstract promises a lot. Martin wrote: ...prediction [concerning soft scalar masses] from commonly used spectrum codes (such as ISAJET, SPHENO, SOFTSUSY, SUSPECT) may be completely wrong... He bothered to enumerate all the popular codes in order to annoy the maximum number of people.

If the Minimal Supersymmetric Standard Model (MSSM) is discovered at the LHC and precisely measured at the ILC, then we will learn about the superparticle masses. We expect that those masses are determined by a high-energy theory which is more fundamental than the MSSM. One possibility is that the high-energy theory is a string theory. Thus, by measuring the MSSM parameters we could get a glimpse of the fundamental theory. If we are extremely lucky, the LHC and ILC measurements could point to some specific realization of string theory.

The relation between masses measured in low-energy experiments and the fundamental parameters is not straightforward. In colliders, we would measure the MSSM parameters at the 1 TeV scale. On the other hand, the fundamental theory directly predicts the mass parameters at a high scale, for example the GUT scale of 10^16 GeV. The two sets of parameters are related by renormalization group equations. Relating the low- and high-energy parameters is so important theoretically that the task has been automated in the publicly available codes mentioned before. The nice thing is that, within the standard MSSM paradigm, the renormalization group equations depend only on quantities that can be measured at low energies. Therefore, unambiguous predictions of high-energy parameters from low-energy inputs can be made. Or so it seemed.
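The simplest example of what the spectrum codes automate is one-loop gauge coupling running; the MSSM scalar-mass RGEs are more involved, but follow the same logic. A toy sketch, assuming the standard MSSM hypercharge coefficient b1 = 33/5 (SU(5) normalization) and a unified coupling α⁻¹ ≈ 24 at M_GUT ≈ 2·10^16 GeV:

```python
import math

def run_inverse_alpha(inv_alpha_high, b, mu_high_gev, mu_low_gev):
    """One-loop running: α⁻¹(μ_low) = α⁻¹(μ_high) + (b/2π)·ln(μ_high/μ_low)."""
    return inv_alpha_high + (b / (2.0 * math.pi)) * math.log(mu_high_gev / mu_low_gev)

# Run the hypercharge coupling from a GUT-like scale down to 1 TeV:
inv_alpha1_tev = run_inverse_alpha(24.0, 33.0 / 5.0, 2e16, 1e3)  # ≈ 56
```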

In his recent paper, Martin pointed out that the relation between the high- and low-energy parameters can be obscured by effects related to supersymmetry breaking. He claims that the renormalization group equations for the MSSM scalar masses contain a potentially large contribution that has been overlooked so far (the gaugino masses are not affected). This contribution comes from self-interactions in the hidden sector that is responsible for spontaneous supersymmetry breaking. It has been assumed so far that this effect can change only the overall normalization of the scalar masses. Martin showed that this assumption is in general wrong and gave examples where the new effects dominate the standard MSSM contributions. The masses of the hidden sector particles are typically of order 10^11 GeV, so we will have no experimental access to the hidden sector parameters. Therefore, collider measurements of scalar superparticle masses may reveal nothing about the fundamental theory.

According to Martin, a string theory fan after listening to his talk would look like this:

This is an artistic vision. However, there is still a large class of string models where the hidden sector renormalization is not dangerous. This is the case when supersymmetry breaking is mediated by light moduli, which includes the beloved KKLT. The moduli interact weakly, with gravitational strength, and therefore cannot mess up the MSSM renormalization group equations. On the other hand, in the case of dynamical supersymmetry breaking à la Seiberg, the hidden sector interacts strongly and large effects are difficult to avoid.

For a comic book version of the story, have a look at the transparencies. For technical details, you should consult the paper.


## Sunday 1 April 2007

### April Fools'07: Aymar resigns

The problems with the LHC inner-triplet magnets turned out to be more serious than anyone expected. As reported here, the start of the LHC will be delayed by several years. Following these reports, the CERN Council held a special meeting today, during which Director General Robert Aymar resigned from his post. I assume full responsibility for what has happened, he said. The Council appointed the theorist John Ellis as the next Director General. Experimentalists have proven incapable of managing such a complicated and sensitive project as the LHC, Ellis said in an interview to the press. It is now time for a change.
