Saturday, 25 August 2007
LHC: The First Year
The New Physics workshop is at its peak. The TH seminar room is bursting at its seams and there are at least two talks every day. Unfortunately, the theory talks last week ranged from not-so-exciting to pathetic. Therefore I clench my teeth and report on a talk by an experimentalist. Fabiola Gianotti was talking about the ATLAS status and plans for physics with first data.
Experimentalists (much like happy families) are all alike. They can never resist showing us a hundred spectacular pictures of their cherished detectors and another hundred showing the tunnel at sunrise and sunset. They feel obliged to report how many kilometers of cable went into each particular part of the detector. They stuff each cm^2 of the transparencies with equally indispensable pieces of information. Usually, this leaves little space for interesting physics. This time, however, was slightly different. Having finished with the pictures, Fabiola told us several interesting things about early physics with the ATLAS detector.
ATLAS is already alive, kicking and collecting data from cosmic-ray muons. The LHC will go online in Spring 2008. That is to say, if all goes smoothly, if nothing else explodes, and if Holger Nielsen refrains from pulling the cards. In the unlikely case of everything going as planned, the first collisions at 14 TeV will take place in July. The plan for ATLAS is to collect a hundred inverse picobarns of data by the end of the year, and a few inverse femtobarns by the end of 2009.
The main task in 2008 will be to discover the Standard Model. 100/pb translates roughly into 10^6 W bosons and 10^5 Z bosons. The decays of the W and Z have been precisely measured before, so this sample can be used for calibrating the detectors. We will also see some 10^4 top-antitop pairs with fully- or semi-leptonic decays. Thus, the top will be detected on European soil and we will know for sure that the Tevatron didn't make it up. In the first year, however, the top samples will serve no purpose other than calibration. For example, the top mass resolution will be of order 10 GeV (the ultimate precision is 1 GeV, but that is a song for the future), far worse than the current Tevatron precision. Apart from that, the QCD jet background at high pT will be measured, something notoriously difficult to estimate theoretically. In short, the 2008 studies will be boring but necessary to set the stage for future discoveries.
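For orientation, those event counts are just cross section times luminosity. Here is a minimal back-of-the-envelope sketch in Python, with assumed order-of-magnitude 14 TeV cross sections (my own rough numbers, including branching ratios and acceptance, not figures from the talk):

# Rough event-count arithmetic: N = sigma x integrated luminosity.
lumi_pb = 100.0  # assumed 2008 dataset, in inverse picobarns
cross_sections_pb = {
    "W (leptonic)": 1.0e4,       # ~10 nb, assumed order of magnitude
    "Z (leptonic)": 1.0e3,       # ~1 nb, assumed
    "t tbar (leptonic)": 1.0e2,  # ~100 pb, assumed
}
for process, sigma_pb in cross_sections_pb.items():
    print(process, "~ %.0e events" % (sigma_pb * lumi_pb))

With these inputs one recovers the 10^6, 10^5 and 10^4 quoted above, which is all one needs to see why the W and Z samples, rather than the tops, will do the calibrating.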
Is there any hope for a discovery in 2008? Fabiola pointed out the case of a 1 TeV vector resonance decaying to an electron pair. This would stand out like a lamppost above the small and well-understood Drell-Yan background and could be discovered with as little as 70/pb of data. The problem is that LEP and LEP2 indirectly constrain the mass of such a resonance to be larger than a few TeV. So not much hope there.
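Still, to see why so little luminosity would suffice for such a resonance, here is a hedged toy estimate. Assume a signal cross section times acceptance of a few hundred femtobarns in a dielectron mass window where Drell-Yan leaves only a handful of events (both numbers invented purely for illustration); the usual counting significance then sails past 5 sigma:

import math
# Toy counting-experiment significance for a dielectron resonance.
lumi_pb = 70.0
s = 0.3 * lumi_pb   # assumed signal cross section x acceptance: 0.3 pb
b = 0.02 * lumi_pb  # assumed Drell-Yan background in the window: 0.02 pb
# asymptotic significance Z = sqrt(2((s+b) ln(1+s/b) - s))
z = math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))
print("s = %.0f, b = %.1f, Z ~ %.1f sigma" % (s, b, z))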
Another hope for an early discovery is supersymmetry with its multitude of coloured particles. From the discovery-reach plot shown in the talk, a gluino lighter than 1.5 TeV could be discovered with 100/pb of data. A quick discovery would be important, as it would give the green light for the ILC project. The problem in this case is that a discovery requires a good understanding of the missing-energy spectra. Most likely, we will have to wait till 2009.
The higgs boson is a more difficult case. From the corresponding plot in the talk, there is no way to see anything at 100/pb. However, with a few inverse femtobarns the whole interesting range of higgs masses will be covered. Thus, according to Fabiola, the higgs puzzle should be resolved by the end of 2009.
The last thing worth mentioning is the ongoing effort to visualize the huge kinetic energy that will be stored in the LHC beam. This time the energy was compared to that of a British aircraft carrier running at 12 knots. The bottom line is the following: if you spot an aircraft carrier on Lac Leman, don't panic, it's just the LHC that lost its beam.
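Out of curiosity, the arithmetic can be checked with the nominal LHC beam parameters and a guess at the carrier's displacement (both my assumptions, not numbers from the talk):

# Kinetic energy of one nominal LHC beam vs an aircraft carrier at 12 knots.
n_bunches = 2808                     # nominal LHC design
protons_per_bunch = 1.15e11          # nominal LHC design
proton_energy_J = 7e12 * 1.602e-19   # 7 TeV in joules
beam_energy_MJ = n_bunches * protons_per_bunch * proton_energy_J / 1e6
carrier_mass_kg = 2.0e7              # assumed ~20,000-tonne carrier
speed_ms = 12 * 0.5144               # 12 knots in m/s
carrier_energy_MJ = 0.5 * carrier_mass_kg * speed_ms**2 / 1e6
print("beam: %.0f MJ, carrier: %.0f MJ" % (beam_energy_MJ, carrier_energy_MJ))

Both come out at a few hundred megajoules, so the comparison more or less holds up.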
The slides are available via the workshop page here.
Friday, 17 August 2007
Restoration of the Fourth Generation
I'm not as successful as the others in spreading rumours, so I'm sadly returning to writing about things I've seen with my own eyes. Last week there were several talks that would deserve a post. I pick perhaps not the most interesting, but the most comprehensible one: Graham Kribs talking about Four Generations and Higgs Physics.
There are three generations of quarks and leptons. Everyone knows that the fourth one does not exist. The Bible says it is excluded at 99.999% confidence level. Graham is spreading the heresy of claiming otherwise. He adds the 4th generation of chiral matter to the Standard Model and reanalyses the constraints on the parameters of such a model.
First of all, there are limits from direct searches. LEP2 set a lower limit of roughly 100 GeV on the masses of the 4th-generation electron and neutrino. The bounds on the 4th-generation quarks from the Tevatron are more stringent: CDF excludes a 4th-generation up-type quark lighter than 256 GeV, and Graham argues that a similar bound should hold for the down-type quark.
Indirect limits from flavour physics can be taken care of by assuming that the mixing between the first three generations and the fourth is small enough. More troubling are the constraints from the electroweak precision tests. For example, the new quark doublet contributes to the S parameter:
$\Delta S = \frac{N_c}{6 \pi} \left ( 1 - 2 Y \log \frac{m_u^2}{m_d^2} \right )$
If the 4th generation quarks are degenerate in mass, the S parameter comes out too large. Splitting the masses could help keep the S parameter under control, though it generates new contributions to the T parameter. Nevertheless, Graham argues that there is a large enough window, with the up-type quark some 50 GeV heavier than the down-type quark, where all the precision constraints are satisfied.
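As a quick numerical check, one can evaluate the formula above at face value, with N_c = 3 and, as my own assumption about the hypercharge convention, Y = 1/6 for the quark doublet:

import math
# Evaluate the Delta-S formula quoted above, taken at face value.
# N_c = 3 for quarks; Y = 1/6 for the quark doublet is an assumed convention.
def delta_S(m_u, m_d, N_c=3, Y=1.0/6.0):
    return N_c / (6.0 * math.pi) * (1.0 - 2.0 * Y * math.log(m_u**2 / m_d**2))
print(delta_S(300.0, 300.0))  # degenerate masses: ~0.16
print(delta_S(350.0, 300.0))  # up-type ~50 GeV heavier: slightly smaller

The degenerate doublet alone gives Delta S of about 0.16, which is indeed uncomfortably large for the electroweak fit; hence the mass splitting, and its interplay with the T parameter, is what keeps the scenario alive.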
The fourth generation may be discovered by direct searches at the Tevatron or at the LHC. But its most dramatic effect would be a profound modification of higgs physics. Within the Standard Model, the higgs is produced at a hadron collider dominantly via gluon fusion, a process that proceeds through a quark loop.
The particle in the loop is the top quark - the only coloured particle in the Standard Model with a sizable coupling to the higgs boson. With the 4th generation at hand, we have two more quarks strongly coupled to the higgs boson. As a result, the higgs production cross section increases dramatically, roughly by a factor of 9. The first hint of the 4th generation could come from the Tevatron, which would see the higgs with an abnormally large cross section. In fact, the Tevatron has already excluded the 4th generation scenario for a range of higgs boson masses (see the latest higgs exclusion plot here).
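The factor of 9 is easy to understand in the heavy-quark limit, where each heavy quark contributes the same amount to the gluon-fusion amplitude (a back-of-the-envelope estimate, with $t'$ and $b'$ denoting the new quarks; finite-mass and higher-order corrections will shift it somewhat):

$\frac{\sigma_{4G}(gg \to h)}{\sigma_{SM}(gg \to h)} \simeq \frac{|A_t + A_{t'} + A_{b'}|^2}{|A_t|^2} \to \frac{|3 A_t|^2}{|A_t|^2} = 9$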
The slides from this and other talks can be found here. If you are interested in more details, have a look at Graham's recent paper. While reading, try not to forget that the 4th generation does not exist.
Sunday, 12 August 2007
Model Building Month
Today the TH Institute "New Physics and the LHC" kicks off here at CERN. As the organizers put it, it will be targeted mostly at model-builders, with the participation of string theorists and collider phenomenologists. Regardless of that looming participation, the program looks quite promising. After the dog days of summer, you can count on more activity on this blog in the coming weeks.
Also today, on the other side of the world, Lepton-Photon'07 began. I wouldn't bother to mention it if not for certain rumours. Rumours of rumours, so to say. I mean the rumours of the alleged non-standard higgs signal at the Tevatron, whispered from blog to blog since May. The new rumour is that the relevant multi-b channel analysis will finally be presented at that conference. In spite of the fact that there are neither photons nor leptons in the signal :-)
Update: well... nothing like that seems to have happened. My deep throat turned out to be shallow. So we are left with rumours and gossip for some more time...
Update #2: Finally, only CDF presented a new analysis of that channel. Their excess is only 1.5 sigma. Details in this public note, a wrap-up by Tommaso here. Now we're waiting for D0. Come on guys, don't be ashamed...
Saturday, 11 August 2007
Entropic Principle
I'm back. Just in time to report on the curious TH seminar last Wednesday. Jim Cline was talking about The entropic approach to understanding the cosmological constant. I am lucky to live in a special moment of history and to observe fascinating sociological processes going on in particle physics. I'm dying to see where this will lead us...
Everyone knows about the anthropic principle. In short, certain seemingly fundamental parameters are assumed to be environmental variables that take random values in different parts of the universe. We can live only in those corners where the laws of physics support more or less intelligent life. This could explain why the values of certain physical parameters appear fine-tuned.
One undeniable success of this approach is Weinberg's prediction concerning the cosmological constant. Weinberg assumed that intelligent life needs galaxies. Keeping all other cosmological parameters fixed, galaxies would not form if the energy density in the cosmological constant exceeded that in matter today by roughly a factor of 5000. If all values of the cosmological constant are equally probable, we should observe a value close to the upper limit. Weinberg's argument thus points to a value some 1000 times larger than the observed one. Still, in the absence of any fundamental understanding of the smallness of the cosmological constant, it remains attractive.
Weinberg's paper appeared in 1987, well before the first indications of accelerated expansion from supernovae started to show up. That was a prediction in the common sense of the word. Since then, the anthropic principle has mostly been employed to predict things we already know. The successful applications include the electroweak breaking scale, the dark matter abundance and the CMB temperature. As for the things we don't know, predictions based on the anthropic principle turn out to be less sharp. For example, the supersymmetry breaking scale is predicted by anthropic reasoning to be at a TeV, or at the Planck scale, or somewhere in between.
The entropic principle, proposed earlier this year by Bousso et al, is a new twist in this merry game. The idea is to replace the selection criterion based on the existence of intelligent life with something more objective. Bousso et al argue that the right criterion is maximizing the entropy production in a causal diamond. The argument is that life of any sort cannot exist in thermal equilibrium but requires free energy. They argue that the free energy divided by the temperature at which it is radiated (that is, the entropy increase) is a good measure of the complexity that may arise in a given spacetime region. They proceed to calculate the entropy production (dominated by the infrared radiation of dust heated by starlight) for various values of the cosmological constant and find that it is peaked close to the observed value. This allows them to conclude that "...this result improves our confidence in the entropic principle and the underlying landscape...". More details on this computation can be found in the paper or in this post on Reference Frame.
Jim in his talk reviewed all that, minus the sneer. He also played his own part in the game: he predicted the primordial density contrast. His conclusion is that the observed value of 10^(-5) is not unlikely, which further improves our confidence in the entropic principle and the underlying landscape. Pushing the idea to the edge of absurdity, he also made an attempt to predict the dark matter abundance. He took an obscure model of gravitino dark matter with non-conserved R-parity. In this model, the dark matter particles decay, thus producing entropy. He argued that the entropy production is maximized close to the observed value of the dark matter abundance, which further improves our confidence in the entropic principle and the underlying landscape. I guess I missed something here. The observed universe is as we see it because the entropy production from the synchrotron radiation of the dark matter decay products may support some odd forms of life? At this point, pulling cards does not seem such a weird idea anymore...
Thursday, 2 August 2007
Closed for Summer
Yeah... it's been more than two weeks since my last post. I'm afraid you need even more patience. Summer is conference time and holiday time, with a hell of a lot of work to do in between. Clearly, I'm not one of them Bloggin' Titans like him or him or her, who would keep blogging from a sinking ship if they had such an opportunity.
My schedule is hopelessly tight until mid-August. After that I should resume my average of 1.5 posts per week. Or maybe more, because much will be going on at CERN TH. I'll be back.