## Thursday, 22 February 2007

### Flavor and Strong Dynamics

This week I cannot report on any of the regular CERN seminars (meaning, I understood nothing or didn't even dare to walk in). Salvation came from the phenomenology journal club, which hosted a short, informal talk by Roberto Contino. Roberto talked about partial compositeness, reviewing a partly forgotten approach to fermion masses in technicolor.

The common lore about technicolor is that it faces two serious problems. One is the difficulty of complying with the electroweak precision tests, in particular with the notorious S parameter. The other is the flavor problem: it is tough to generate the observed fermion mass pattern without producing excessive flavor-changing neutral currents.

Typically, technicolor models generate the fermion masses as follows. First, at some high scale $\Lambda_0$, one introduces a four-fermion operator $\Lambda_0^{-2}ff\chi\chi$ that marries two standard model fermions $f$ and two technifermions $\chi$. The technifermions condense, breaking the electroweak symmetry and giving mass to the W and Z bosons. When this happens, four-fermion operators like the one above also produce mass terms for the standard model fermions. Parametrically, the fermion masses are given by
$m_{f}\sim\Lambda_{TC} \left ( \frac{ \Lambda_{TC} }{ \Lambda_{0} }\right )^{d-1}$
where $\Lambda_{TC}$ is the technicolor scale of order TeV and $d$ is the dimension of the technifermion bilinear. The classical dimension is of course $d=3$, but in a strongly interacting theory renormalization effects may lead to a different, anomalous dimension. The problem is that in calculable setups $d \geq 2$. This leads to an unpleasant tension. On the one hand, to obtain the large top quark mass we need $\Lambda_0$ to be rather close to $\Lambda_{TC}$. On the other hand, we would like $\Lambda_0$ to be as high as possible, because generic technicolor models also generate unwelcome operators with four standard model fermions: $\Lambda_0^{-2}ffff$. If $\Lambda_0$ is too low, these lead to excessive flavor violation, inconsistent with, for example, the kaon mixing experiments.
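To see the tension numerically, here is a back-of-the-envelope sketch in Python. It simply inverts the scaling relation above for $\Lambda_0$; the input numbers (a 1 TeV technicolor scale, the best calculable case $d=2$) are illustrative choices of mine, not figures from the talk.

```python
# Rough illustration of the technicolor flavor tension.
# Inverts m_f ~ Lambda_TC * (Lambda_TC / Lambda_0)**(d - 1) for Lambda_0.
# All scales in TeV; the numbers are illustrative, not from the talk.

def lambda0_for_mass(m_f, lambda_tc=1.0, d=2.0):
    """Scale Lambda_0 needed to generate a fermion mass m_f (in TeV)."""
    return lambda_tc * (lambda_tc / m_f) ** (1.0 / (d - 1.0))

top = lambda0_for_mass(0.175)       # top quark, ~175 GeV
strange = lambda0_for_mass(1e-4)    # strange quark, ~100 MeV
print(f"Lambda_0 for the top:     {top:.1f} TeV")
print(f"Lambda_0 for the strange: {strange:.0f} TeV")
# The top quark drags Lambda_0 down to a few TeV, whereas kaon mixing
# wants the scale of the f f f f operators far above that -- the tension.
```

With $d=2$ the top alone forces $\Lambda_0$ down to a few TeV, while the flavor-violating four-fermion operators would like it orders of magnitude higher.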

Though these problems could be overcome by laborious model-building, there exists in fact a simple solution, proposed long ago by David B. Kaplan. All the problems mentioned above disappear without a trace when the standard model fermions couple linearly to technicolor operators: $\lambda\,f{\cal O}$. The fermion masses are now set by the anomalous dimension $\gamma$ of the coupling $\lambda$. If $\gamma$ is positive, the coupling gets suppressed at low energies and one gets
$m_{f}\sim\Lambda_{TC}\left(\frac{\Lambda_{TC}}{\Lambda_{0} }\right )^{2\gamma}$
which is appropriate for light fermions of the first and second generation. When $\gamma$ is negative, the coupling goes to an IR fixed point and one finds
$m_{f}\sim\Lambda_{TC}$
which is appropriate for the top quark. In this scenario, $\Lambda_0$ can even be as high as the Planck scale. No flavor problem! It turns out that technicolor has "only" one serious problem: that with the electroweak precision tests.
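The point is easy to check numerically: with $\Lambda_0$ at the Planck scale, modest spreads in the anomalous dimensions translate into huge mass hierarchies. A minimal sketch, with illustrative values of $\gamma$ chosen by me (not taken from the talk):

```python
# Sketch: fermion mass hierarchies in partial compositeness,
# m_f ~ Lambda_TC * (Lambda_TC / Lambda_0)**(2*gamma)  for gamma > 0,
# m_f ~ Lambda_TC                                      for gamma <= 0.
# Lambda_0 sits at the Planck scale; gamma values are illustrative.

LAMBDA_TC = 1.0      # TeV, technicolor scale
LAMBDA_0 = 1.2e16    # TeV, roughly the Planck scale

def fermion_mass(gamma):
    if gamma <= 0:
        # coupling flows to an IR fixed point: no suppression (top-like)
        return LAMBDA_TC
    return LAMBDA_TC * (LAMBDA_TC / LAMBDA_0) ** (2 * gamma)

for gamma in (-0.1, 0.05, 0.15, 0.3):
    print(f"gamma = {gamma:+.2f}  ->  m_f ~ {fermion_mass(gamma):.2e} TeV")
# O(0.1) differences in gamma span many orders of magnitude in mass,
# with no need to lower Lambda_0 and reintroduce the flavor problem.
```

A small positive $\gamma$ already suppresses the mass by several orders of magnitude over the 16 decades between $\Lambda_{TC}$ and the Planck scale, which is what makes room for the light first- and second-generation fermions.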

One practical consequence of this scenario is that the standard model fermions mix with composite states of the technicolor sector, hence the name partial compositeness. This could show up as deviations of the fermionic interactions from the standard model predictions. For example, if the b quark has a sizable admixture of composite states, the $Z \to b\bar b$ branching ratio will be modified. In fact, the mixing with the technicolor sector is expected to be strongest for the heavy fermions. Thus, the top quark should be mostly composite. Recall that the top couplings have been very poorly measured so far...

Another nice thing about this mechanism is that it can be trivially implemented in 5D holographic models. This is the connection to Roberto's present research, but that's a longer story. If you want to know more about it, I recommend starting with a short review article by Roberto himself.

PS. As you may have noticed, this post contains a number of equations. Equations are an efficient tool for reducing the number of readers. For that, my eternal gratitude to the author of the script which enables LaTeX embeddings :-)
But the bounding boxes around the equations should not be there. Heeeelp!

## Saturday, 17 February 2007

### David Kaplan on SUSY, in general

David had a busy week at CERN. Besides making a documentary thriller about the LHC and performing at the theory seminar, he also gave a series of four lectures entitled Introduction to Supersymmetry. Somewhat unexpectedly, behind this title hides a pretty basic introduction to supersymmetry. A particle theorist working beyond the standard model would not learn anything new from these lectures. For the remaining part of the population, they offer a nice account of the current theoretical and experimental status of supersymmetry. David strikes, I believe, a good balance between enthusiasm and skepticism.

The first lecture contains a review of the standard model and motivations for going beyond it. The second is about constructing susy lagrangians, that of the MSSM in particular. By the end of this lecture most of the experimentalists had vanished from the audience, and David could move on to more advanced subjects. The remaining two lectures introduce various models of supersymmetry breaking. A lot of time is devoted to discussing possible experimental signals, with more than the usual emphasis on non-standard scenarios. The last lecture, in fact, has quite an overlap with the theory seminar.

The video recordings and the transparencies can be found here.

PS. I can't find a photo on the internet, except the one I already used in the previous post. Does anybody have a funny photo of David, e.g. standing on his head or parachute jumping?

## Wednesday, 14 February 2007

### David Kaplan on non-standard SUSY

Rumours of its death have been greatly exaggerated. Today, instead of interesting people, we had an interesting talk in our Wednesday TH seminar. David E. Kaplan is interesting in an interesting way. The good thing about it is that I don't need to devise any bons mots for this post: I just copy and paste the ones he said. I wish my task were always so easy ;-)

The reason for turning attention to non-standard susy scenarios is the fine-tuning problem of electroweak symmetry breaking. Yes, the very fine-tuning that used to be the main motivation for supersymmetry now has a supersymmetric avatar that plagues all simple susy models. By the way, David is the author of the most apt definition of fine-tuning. It goes like this: a model is fine-tuned if a plot of the allowed parameter space makes you wanna puke.

As an illustration, today he presented this plot:
It shows the allowed parameter space of mSUGRA - the most popular of the MSSM scenarios and a common reference for setting experimental limits on susy particles. The allowed space is the narrow green band, which from a distance looks like just another line. Indeed, after seeing this one you will never put mSUGRA in your mouth again. The situation is slightly better in the general, unconstrained MSSM, but not much better: it is tough to reduce the fine-tuning below one part in ten. This is bad enough to justify serious theoretical and experimental studies of various extensions of the MSSM.

There are several directions one can pursue to reduce the fine-tuning. All have one thing in common: since, by definition, the MSSM is minimal, the other scenarios get complicated. David chose the direction that can be summarized as: the higgs was at LEP, but we were dumb enough to miss it. The fine-tuning problem in supersymmetric models is fueled mostly by the 115 GeV limit on the higgs boson mass. But this limit applies to a higgs with standard-model-like interactions. If the higgs decays are modified, the limit may become less stringent. With a higgs mass of order the Z mass, the electroweak fine-tuning problem can be avoided.

With this in mind, David removes one of the MSSM assumptions: R-parity conservation. R-parity was designed to prevent excessive proton decay, but if only some of the R-parity violating couplings are switched on, we can get away with it. David adds the baryon number violating UDD terms to the MSSM superpotential. This opens the possibility of the higgs decaying into two neutralinos, each of which subsequently decays to three quarks (without R-parity, the lightest neutralino is no longer stable). A higgs decaying into six jets was not properly studied at LEP, and it could be as light as 80 GeV.

In the presence of R-parity violation, susy phenomenology is dramatically modified. There is no stable LSP (lightest supersymmetric particle), hence no missing energy signatures. The bulk of David's talk was devoted to strategies for discovering the higgs and the susy particles in this weird scenario. The funny thing is that the main role would fall to LHCb (usually considered a poor relative of ATLAS and CMS), as its design may allow it to see the displaced vertices associated with the neutralino decays.

Should we all believe in the R-parity violating MSSM then? Not necessarily. David's model is not perfectly natural itself, as hiding the higgs requires a non-generic choice of parameters. However, his talk made it clear that our theoretical bias has badly influenced experimental searches for new physics (for example, non-standard higgs decays haven't been given enough attention). What's worse, the theoretical bias has promoted scenarios that nowadays seem implausible. After 30 years of physics beyond the standard model, we have realized we have no idea what to expect at the LHC (unless it is just the standard model). So the clever thing to do now is to investigate as broad a spectrum of new physics signals as possible. David's famous last words were: we should think about what we can use a 14 TeV machine for, beyond killing the neighbour's dog.
The transparencies are not available, as usual (and Carthage must be destroyed). The paper of David and company is here.

Update: This post hit the charts because it contains the word puke. It's not me, it's David - I'm just reporting ;-) Pity he didn't say anything with f..., I would certainly get even more hits.

## Sunday, 11 February 2007

### Alain Connes' Standard Model

Last Thursday Alain Connes gave a talk at CERN TH. Alain is a famous mathematician with important contributions in the areas of operator algebras and non-commutative geometry. He has gathered quite a collection of prestigious awards, including the Fields Medal and the Crafoord Prize. What could bring such a figure to a particle theory seminar? He was sharing his views on the elementary interactions in a talk entitled The Standard Model coupled with gravity, the spectral model and its predictions.

Alain's approach to particle physics is orthogonal to that of the particle physics community. Whereas we try to figure out what sort of new physics could be responsible for the weird structures of the Standard Model, he treats those very structures as an indication of the underlying geometry of space-time. This is certainly original, which has its positives and negatives. On one hand, I find it reassuring that people out there are exploring different ways; in the end, it is conceivable that the standard approach will prove terribly wrong. On the other hand, Alain's language can hardly be understood here at CERN. No, he wasn't speaking French ;-) but I quickly got lost in the forest of KO-theory, metric dimensions and spectral triples. I'm not able to review or evaluate any technical details of his work, but I would like to share a few remarks anyway.

His program consists in identifying a structure of space-time that could give rise to the Standard Model + gravity. He finds that the answer is the product of an ordinary spin manifold with a finite noncommutative discrete space of KO-dimension 6 modulo 8 and of metric dimension 0, whatever that means. The discrete space is responsible for the spectrum, the symmetries and the interactions of the Standard Model. Most of the Standard Model parameters correspond to the freedom of parametrizing the internal geometry. There are, however, three constraints:
1. The gauge couplings should be unified at some scale. The unification is rather weird, as there are no exotic gauge bosons, hence no proton decay.
2. There is a relation between the sum of the fermion masses squared and the W mass. In practice, this is a constraint on the top mass, which is roughly obeyed in nature.
3. Finally, there is a prediction for the higgs quartic coupling, which implies the higgs boson mass of order 170 GeV.
Is it possible that his approach will provide new insights into the Standard Model and beyond? Not likely. As far as I understood, the fine structure of space-time has no implications that could be observed at the LHC or in other experiments in the foreseeable future. Next, the Standard Model is not the unique system that allows for such a geometrical embedding. Before the neutrino masses were discovered, Alain himself pursued a different scenario leading to massless neutrinos. In fact, the non-unification of the gauge couplings within the Standard Model suggests that there should be more low-energy structures, asking for a different space-time geometry. According to Alain, supersymmetry could find a place in this game, too. Thus, his program can hardly constrain the options for the LHC. Even the 170 GeV higgs mass is sensitive to the assumptions he makes, e.g. to the value of the unification scale. In conclusion, his approach seems more a mathematical re-interpretation of QFT structures than a self-standing physical theory.

In spite of these objections, I really enjoyed the talk. I think it is due to Alain's manner of speaking: a soft voice full of wonder at the mathematical beauty he perceives in his models. One could think he speaks of autumn trees or little birds in a nest, not of scary non-commutative geometry :-) This sort of enthusiasm is rare these days.

The transparencies are not available, as usual. For brave souls, the technical details can be found in the recent paper of Alain and collaborators.

## Wednesday, 7 February 2007

### Interesting People

There's nothing interesting going on in particle theory these days. Somebody finally realized it. Therefore our weekly Wednesday theory seminars have been postponed until better days. Instead, a new Wednesday seminar series has been launched that goes under the name Interesting People. Today we had a talk entitled In the coming weeks we expect the invisible Mr. T. Walters, the bicycle choir and a man with a tape recorder up his nose (to be confirmed). If you can set bricks to sleep, give a cat influenza or have any other remarkable skills, don't hesitate to contact the organizers of the Wednesday seminar.