Last Thursday we had a colloquium by Frank Wilczek here at CERN. Frank has made some impressive contributions to areas as different as astrophysics, particle physics and condensed matter physics. He has also provided a lot of beautiful insight into quantum field theory (asymptotic freedom, fractional statistics, color superconductivity). This was a good sign. On the other hand, Frank is also a Nobel Prize winner. This was a bad sign. Nobel Prize winners tend to fill their talks with banal statements written in large font to make a more profound impression. In the end, we observed a fight between good and bad, with the latter winning, I'm afraid.
Snide remarks aside, the colloquium had two separate parts. In the first one, Frank was advertising the possibility of phantoms appearing at the LHC. Phantoms are light scalar fields that are singlets under the Standard Model gauge group. For such singlets it is impossible to write renormalizable interactions with the Standard Model fermions (except for the right-handed neutrino), which might be a good reason why we haven't observed such things so far. We can write, however, renormalizable interactions with the Higgs. Therefore the phantom sector could show up once we gain access to the Higgs sector.
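For concreteness (this is just the general form of the allowed couplings, not anything specific to Frank's talk): the only renormalizable, gauge-invariant interactions of a real singlet scalar phi with the Standard Model fields go through the Higgs doublet H,

$$ \mathcal{L} \;\supset\; \mu\,\phi\, H^\dagger H \;+\; \lambda\,\phi^2\, H^\dagger H , $$

which is why such a "Higgs portal" sector would naturally show up first in Higgs physics.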
Various theories, better or worse motivated, predict the existence of phantoms. Probably the best motivated phantom is the one especially dear to the speaker: the axion. This was the bridge to the second part of the talk, based on his paper from 2005, where Frank discussed the connection between axions, cosmology and... the anthropic principle. Yes, Frank is another stray soul who has fallen under the spell of the anthropic principle.
Axions have been proposed to solve the theta problem of QCD. As a bonus, they turn out to be a perfect dark matter candidate. Their present abundance depends on two parameters: the axion scale f at which the Peccei-Quinn symmetry is broken, and the initial value of the axion field theta_0. The latter is usually expected to be randomly distributed, because in the early hot universe no particular value is energetically favoured. With theta_0 taking random values within the observable universe, one obtains the upper bound f ≲ 10^12 GeV.
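For orientation, the standard misalignment estimate of the axion relic abundance reads, roughly (the O(1) prefactor depends on the details of the QCD phase transition, so treat the numbers as indicative),

$$ \Omega_a h^2 \sim 0.1\,\theta_0^2 \left(\frac{f}{10^{12}\,\mathrm{GeV}}\right)^{7/6} . $$

For a typical random misalignment angle theta_0 of order one, demanding that axions do not overshoot the observed dark matter abundance translates into the quoted bound f ≲ 10^12 GeV.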
The scenario discussed was one with low-scale inflation, in which the whole observable universe inherits a single value of theta_0, randomly chosen by some cosmic accident. One can argue that the resulting probability distribution of the dark matter abundance (per log interval) is proportional to the square root of this abundance, favouring large values (a one-line derivation is sketched below). Enter the anthropic principle. The observation is that too much dark matter could be dangerous for life. Frank made more precise points about halo formation, black holes, too-close stellar encounters, matter cooling and so on. In short, using the anthropic principle one can cut off the large-abundance tail of the probability distribution. One ends up with this plot:
The dotted line is the observed dark matter abundance. The claim is that axions combined with anthropic reasoning perfectly explain dark matter in the universe.
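The square-root scaling quoted above follows from a simple change of variables (a sketch, assuming theta_0 is uniformly distributed and that the abundance scales as theta_0^2):

$$ \Omega \propto \theta_0^2, \qquad dP \propto d\theta_0 \propto \frac{d\Omega}{\sqrt{\Omega}} \quad\Rightarrow\quad \frac{dP}{d\ln\Omega} = \Omega\,\frac{dP}{d\Omega} \propto \sqrt{\Omega} . $$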
My opinion is that postdictions based on the anthropic principle aren't worth a penny. This kind of result relies mostly on our prejudices concerning the necessary conditions for life to develop. If they prove anything, it is rather the limits of human imagination (by the way, I once read an SF story about intelligent life formed by fluctuations on a black hole horizon :-) Only impressive, striking and unexpected predictions may count. That's what Weinberg did. That's why some exclaimed "Oh shit, Weinberg got it right". Nobody would ever use a swearword in reaction to the plot above...
For more details, consult the paper. If you are more tolerant of anthropic reasoning, here you can find the video recording.
Tuesday, 29 May 2007
SN 1987A
Twenty years ago, Sanduleak -69 202a - a blue supergiant in the Large Magellanic Cloud - turned into a supernova before our very eyes. This very well designed cosmic experiment allowed us to peer into supernova dynamics, learn about neutrino properties, put constraints on physics beyond the Standard Model, and much more. And the pictures are so spectacular :-) Arnon Dar told us all about it at the TH seminar last Wednesday. So many things were said that I cannot report them all here. I'll just pick a few stories.
Neutrinos: In some 15 seconds, the Kamiokande-II and IMB neutrino detectors registered a total of 20 neutrinos above the background. This was the first and so far the only detection of neutrinos from beyond the Solar System. Registering the neutrino pulse confirmed that the core collapse model of supernova explosions is roughly valid and that the gravitational energy of the collapse is released mostly into neutrinos. On the quantitative side, the neutrino pulse allowed an estimate of the total energy and the temperature of the explosion. For the neutrino community, it provided constraints on the magnetic moment and the electric charge of neutrinos, as well as on their right-handed couplings.
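For a rough sense of the numbers (a back-of-the-envelope estimate, not a figure from the talk): the energy released is essentially the gravitational binding energy of the newly formed neutron star,

$$ E \simeq \frac{3}{5}\frac{G M^2}{R} \approx 3\times 10^{53}\ \mathrm{erg} \qquad \text{for } M \approx 1.4\,M_\odot,\ R \approx 10\ \mathrm{km}, $$

of which the vast majority is carried away by neutrinos. The couple of dozen events seen on Earth are what one expects from such a burst at the distance of the Large Magellanic Cloud, given the detector masses.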
New physics: From the duration of the neutrino pulse we know that other hypothetical light particles cannot carry away the supernova's energy too efficiently. This provides bounds on putative theories beyond the Standard Model. For example, one can derive an important upper bound on the axion mass, m < 0.01 eV.
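To connect this with the axion discussion above (a standard relation, with the prefactor quoted only approximately): the axion mass and the Peccei-Quinn scale are tied together,

$$ m_a \approx 6\ \mu\mathrm{eV} \times \frac{10^{12}\ \mathrm{GeV}}{f_a}, $$

so the SN 1987A bound m_a ≲ 0.01 eV translates into f_a ≳ 10^9 GeV.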
Rings: The rings consist of gas ejected from the progenitor star some 20,000 years prior to the explosion. It is not known precisely what shaped this beautiful structure. The rings were lit up by the light from the explosion only several months later. The delay allowed a calculation of the distance to the supernova: 168,000 light-years. Thus we know quite accurately the distance to the Large Magellanic Cloud, which is an important rung in the cosmic distance ladder. In this way, SN 1987A contributed to measuring the Hubble constant.
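The geometry behind this (a sketch of the standard light-echo argument, with rounded numbers): for a circular ring of radius R inclined at an angle i to the plane of the sky, the near and far edges light up with delays

$$ t_\mp = \frac{R}{c}\,(1 \mp \sin i) \quad\Rightarrow\quad R = \frac{c\,(t_- + t_+)}{2}, \qquad D = \frac{R}{\theta}, $$

where theta is the measured angular radius of the ring. With turn-on and turn-off delays of very roughly 80 and 400 days and theta of about 0.8 arcsec, the ring radius comes out at about two thirds of a light-year, and the distance at roughly the quoted 168,000 light-years.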
Light curve: The afterglow is powered by the radioactive decay of heavy elements produced during the explosion. The decline of the supernova's brightness nicely fits the picture of the radioactive decay chain Nickel-56 -> Cobalt-56 -> Iron-56. This confirmed the expectation that the heavy elements in the Universe are produced and scattered by supernova explosions. The amount of Cobalt-56 produced could be quite accurately estimated at 0.07 solar masses.
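The characteristic slope of the tail follows directly from the half-lives (a textbook estimate, not specific to the talk): once the short-lived Nickel-56 (half-life about 6 days) is gone, the luminosity tracks the decay of Cobalt-56,

$$ L(t) \propto e^{-t/\tau_{\mathrm{Co}}}, \qquad \tau_{\mathrm{Co}} = \frac{t_{1/2}}{\ln 2} \approx \frac{77\ \mathrm{days}}{0.69} \approx 111\ \mathrm{days}, $$

which corresponds to a decline of roughly one magnitude per 100 days, just as observed in the SN 1987A tail.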
Missing remnant: Observations agree very well with models of the core collapsing into a neutron star. However, the neutron star remnant has not been observed so far. Is it just veiled by dust, or did something else form? A black hole? A hyperon star? A quark star?
Arnon also spoke about current research in supernova physics. He devoted quite some time to gamma-ray bursts and his cannonball model. But that would be too much for one post...
So much fun from such brief fireworks. No new physics was found, but tons of new astrophysics. Unfortunately, due to budget cuts in science, no other nearby supernova has been exploded in recent years. There are however serious projects to explode Eta Carinae in the Milky Way, see here or here. Let's wait and see ;-)
Transparencies not available, as usual.
Sunday, 20 May 2007
LHC Roulette
There are no seminars these days that would be interesting enough to report on. To keep blogging, I'm afraid, I have to present you with some musings of my own. I have always wanted to spell out my expectations concerning what the LHC will discover. We know for sure that it will reveal the mechanism of electroweak symmetry breaking. But for the first time in many years, the outcome of a collider experiment is not obvious before the start. Now that the LHC starting date has been delayed again, it is the perfect time to play guessing games.
This is my pie: Unexpected 40%, Standard Model 40%, Strong Dynamics 15%, Exotics 5%.
Why do I expect the Unexpected? Although the Standard Model describes the data perfectly well, it leaves many questions unanswered. And my feeling is that none of its extensions on the market really answers these questions. At least, not more than one at the same time. Of course, the most publicized is the hierarchy problem, which all the current models find difficult to handle. Less discussed, but perhaps more troubling, is the question of fermion masses. Electroweak symmetry breaking should give masses not only to the W and Z but also to the fermions. We observe striking patterns in fermion masses, but we have no clue where they come from. For example, the up and down quarks are almost degenerate in mass, while isospin is badly broken in the third generation. Besides that, there is the question of dark matter. While many models accommodate a dark matter candidate, they rarely explain the striking fact that the baryonic and dark matter abundances are so close to each other. I maintain the childlike belief that there could be a simple and spectacular answer to some of these questions within an ordinary quantum field theory. That there is something we are missing. Something obvious in hindsight, but very tough to pinpoint without a hint from experiment. Like anomalies in the old days.
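For the record, the coincidence mentioned above in numbers (approximate WMAP-era values):

$$ \Omega_b h^2 \approx 0.022, \qquad \Omega_{\mathrm{DM}} h^2 \approx 0.11, \qquad \frac{\Omega_{\mathrm{DM}}}{\Omega_b} \approx 5, $$

two a priori unrelated quantities that end up within a factor of a few of each other.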
Next, the worst of all worlds: the Standard Model with a fairly light Higgs boson. Unfortunately, all the collider data point to this possibility. Fermion masses and dark matter could be the result of some obscure physics at other energy scales. The hierarchy problem could well be the hierarchy fact, leaving us prey to landscape speculations. Really, a nightmare scenario, and probably the end of particle physics as we know it.
Strong dynamics with a composite Higgs, or no Higgs at all. This is a very natural possibility. A similar mechanism has been chosen by nature to break the chiral symmetry of QCD. Qualitatively, that breaking can also be described by a scalar field that gets a vev (the linear sigma model); at a deeper level, the scalar turns out to be a condensate of quark-antiquark pairs. However, it is hard to imagine how strong interactions at the TeV scale could remain hidden from our precise low-energy experiments. There are a thousand places where the effects could in principle show up: flavour, CP violation, electroweak precision tests. We see none. The most troubling question is: why would nature design a theory with the seemingly sole purpose of hiding it from our low-energy eyes? Nevertheless, there is always hope that there is some misunderstanding due to our ignorance of strong interactions.
Finally, Exotics. In this category I put all the crowning achievements of theoretical particle physics in the last 30 years: supersymmetry, large extra dimensions, Little Higgs, TeV strings, TeV unicorns... Most likely, in a few years all these theories will be given a prominent place in the history of science, along with phlogiston and Lamarckian evolution. They all have the same philosophical problem as TeV-scale strong interactions: why would someone take so much trouble to hide their effects from low-energy experiments? Only they seem far less natural than strong dynamics. Why did I give them as much as 5 percent then? Well, there is a large volume effect here ;-)
That's it from me. As you can see, I'm naivety and skepticism in one. If you're more confident about a particular existing model beyond the Standard Model, you're welcome to lose 1000 dollars in Tommaso's online house of games.
Wednesday, 16 May 2007
TH Upgrade
While problems are mounting for the LHC, things are going ever so smoothly in the Theory Division. Our working conditions are constantly improving. The coffee machine that had witnessed the glorious eighties has been replaced by a new one that actually makes coffee. The common room has been renovated and equipped with blackboards on which it is actually possible to write. The renovation works in the cafeteria are coming to an end, and there is hope the queuing time will drop below 30 minutes. By the way, The New Yorker recently wrote that "...man can build a superconducting collider but not a functional cafeteria". It seems the guy was wrong. Twice ;-/
With all these improvements, productivity of CERN theorists is expected to increase in the coming months. Or decrease...the new sofas are damn comfortable :-) Time will tell.
Sunday, 13 May 2007
T-parity Shot Down?
Little Higgs is a framework designed to break the electroweak symmetry of the Standard Model without running into the hierarchy problem. Roughly speaking, the idea is that some new strong interactions at 10 TeV produce bound states, a subset of which ends up being much lighter than 10 TeV. These mesons, described as pseudo-goldstone bosons, are identified with the Higgs field of the Standard Model. All this may appear involved, but similar dynamics is observed in real-life QCD, where relatively light pions emerge as bound states. In Little Higgs, somewhat more complicated structures (new gauge bosons and new quarks) at the TeV scale are required to keep the electroweak scale ~ 100 GeV stable.
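Roughly, by naive dimensional analysis, the ladder of scales looks like this (a schematic summary rather than a statement about any particular model):

$$ v \sim 100\ \mathrm{GeV} \;\ll\; f \sim 1\ \mathrm{TeV} \;\ll\; \Lambda \sim 4\pi f \sim 10\ \mathrm{TeV}, $$

where f is the analogue of the pion decay constant and Lambda the scale of the strong dynamics; collective symmetry breaking is what keeps the pseudo-goldstone Higgs light across the first gap.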
Little Higgs models have received a lot of attention, and there is good hope that something like this may turn up at the LHC. However, generic models have a hard time complying with electroweak precision constraints. One way out is to introduce a parity symmetry that forces the new TeV-scale particles to couple only in pairs to the Standard Model. In such a case, the new particles can contribute to the electroweak observables only via loop processes, and their contribution may be sufficiently suppressed. Such a symmetry, dubbed T-parity, was introduced by Cheng and Low. Little Higgs models with T-parity, although somewhat more involved than the minimal ones, lead to electroweak symmetry breaking without fine-tuning. As a by-product, the lightest particle with negative T-parity, usually one of the new neutral gauge bosons, is forbidden to decay. This 'heavy photon' thus constitutes a nice dark matter candidate. Perfect.
From the recent paper by Hill & Hill it follows that things are not so bright. The paper is rather technical, but the message is clear. Little Higgs models are usually formulated as a low-energy effective theory, without bothering about the strongly interacting theory at 10 TeV that gave birth to it. Hill^2 argue that in any conceivable UV completion of the Little Higgs models T-parity is broken. The effect is induced by anomalies and is similar in spirit to the one allowing the pion to decay into two photons. At the technical level, the low-energy lagrangian of Little Higgs models should be augmented with the Wess-Zumino-Witten term that reflects the structure of anomalies in the UV theory. This WZW term breaks T-parity.
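To illustrate the pion analogy (a schematic form; the coefficient is quoted only up to convention-dependent factors): in QCD the anomaly-induced piece of the WZW action contains

$$ \mathcal{L} \;\supset\; \frac{e^2}{16\pi^2}\,\frac{\pi^0}{f_\pi}\,\epsilon^{\mu\nu\rho\sigma} F_{\mu\nu} F_{\rho\sigma}, $$

which violates the naive symmetry of the lowest-order chiral lagrangian under which the pion field flips sign, and thereby lets the neutral pion decay to two photons. The claim is that the analogous WZW terms in Little Higgs models violate T-parity in the same way.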
What are the consequences? The new T-parity breaking terms should not mess with the precision electroweak observables, as they are related to anomalies and therefore loop-suppressed. But the lightest T-odd particle is no longer stable and does not constitute a good dark matter candidate. In particular, the heavy photon may decay into two W bosons, too quickly to play the role of a dark matter particle.
In the earlier paper by the same authors you can find more technical details about the WZW terms in the context of Little Higgs.
What if the heavy photon is lighter than a pair of W bosons? Can the day be saved?
Sunday, 6 May 2007
LHC Rumours
A blog is a perfect place to spread wild rumours. I would prefer to gossip about fantastic new discoveries at CERN. Instead, all I have are rumours of delays. Those now travelling down the corridors of CERN are out-bloody-rageous. It seems that the LHC 0.9 TeV pilot run scheduled for November will be cancelled. The recent accident with the magnets is not the primary cause, but rather the last straw. The LHC will go online next year, immediately in its full 14 TeV glory. But there is little chance of this happening in spring 2008; summer 2008 is a more realistic date. All I've said is by no means official but, according to well-informed sources, inescapable.
Until then...
Besides, I will be online to help you pass the time ;-)
Saturday, 5 May 2007
Quantum Mechanics and Spacetime
I was rather harsh with James Hartle about the colloquium he gave the other day. As a rule, I'm not picky, but I'm more demanding of celebrities. Last week, James also gave a theory seminar that I didn't like either. This time, however, I'm going to hold my temper and just report the story told.
The title was Generalizing quantum mechanics for quantum spacetime. The usual formulation of quantum mechanics, with the Schrödinger equation and the unitary evolution of states, is based on predefined time and space. James was discussing a framework that is suitable for a system with no fixed spacetime geometry. One obvious motivation for such a generalization is the application to the early universe.
The framework discussed in the seminar derives from the quantum mechanics of closed systems. This formulation requires the Hamiltonian and the initial quantum state as an input, but does not rely on the notions of a classical regime, external observers or measurement. The key object is a set of coarse-grained alternative histories of the particles in a system. These are bundles of fine-grained histories - the Feynman paths of the particles. For example, a coarse-grained history could be the position of the center of mass of the Earth specified to a certain accuracy. One defines branch state vectors and the decoherence functional to quantify the quantum interference between alternative histories. Probabilities can be assigned to those sets of histories for which the interference between their members is negligible.
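In the standard notation of the consistent-histories literature (a sketch of the usual projection-operator formulation, not necessarily the exact form used in the seminar), a coarse-grained history is a chain of projections, and the interference between two histories is measured by the decoherence functional:

$$ C_\alpha = P^n_{\alpha_n}(t_n) \cdots P^1_{\alpha_1}(t_1), \qquad D(\alpha',\alpha) = \mathrm{Tr}\left[ C_{\alpha'}\, \rho\, C_\alpha^\dagger \right]. $$

When the off-diagonal elements of D are negligible, the diagonal ones, p(alpha) = D(alpha,alpha), can consistently be interpreted as probabilities.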
In the presence of a fixed spacetime geometry, in which we can define the timelike direction and the foliation into spacelike surfaces, this formulation can be proven equivalent to the standard one. Basically, this is just the equivalence between the Feynman path integral and the usual formulation of quantum mechanics. James argued that this formulation can accommodate quantum spacetime too. In this case, the alternative histories do not represent evolution in spacetime, but evolution of spacetime. The key object would be a set of coarse-grained histories of a metric and matter fields on a fixed manifold.
Although all this sounds plausible, not much more than words, words, words was presented during the seminar. There was not the slightest mention of a possible experimental verification of these ideas. Another worry is that this formulation, by itself, does not address the problem of divergences in quantum gravity. In all known cases where this is taken care of, e.g. in string theory, the spacetime description breaks down at some energy scale. It is not clear how the alternative-histories formulation could fit into such a picture.
The transparencies, of course, are not available (why oh why?). Luckily, James has this article on the arXiv that pretty well covers the material presented in the seminar.
Thursday, 3 May 2007
Physics, Math and Comics
People around here complain that Resonaances has become too serious. It has even been linked as "serious hep-ph" by one of my fellow bloggers. To be honest, writing about particle theory is what I enjoy most. But being taken seriously is the last thing I wish for. So I have made a commitment to post light articles once a week (twice a week during Lent). This is my first post under the label Distraction.
Recently, I was told about xkcd and came to like it a lot. The site is run by Randall Munroe, a physicist and engineer from Virginia. Randall presents cheerful comics with lots of self-ironic references to physics and math. I consider his artwork an excellent comment on the peculiar state of mind that develops here at CERN ;-)
This is my favourite:
I wonder why I like it... have I been acting strangely recently?
This one, I believe, should make all bloggers smile:
There is much more. Enjoy. New comics appear every week. If you're curious what xkcd stands for, here is the explanation.