In the best of all worlds, all experimental data acquired by humanity would be stored in a convenient format and could be freely accessed by everyone. Believe it or not, the field of astrophysics is not so far from this utopia. The policy of the biggest sponsors in that field, NASA and ESA, is to require that more-or-less raw data (sometimes in a pre-processed form) be made public some time, typically 1-2 years, after the experiment starts. This policy is followed by such cutting-edge experiments as WMAP, Fermi, or, in the near future, Planck. And it is not a futile gesture: quite a few people from outside these collaborations have made good use of these publicly available data, and more than once maverick researchers have made important contributions to physics.
Although the above open-access approach appears successful, it is not being extended to other areas of fundamental research. There is a general consensus that in particle physics an open-access approach could not work because:
- bla bla bla,
- tra ta ta tra ta ta,
- chirp chirp,
- no way.
Enter RECAST, a semi-automated framework for recycling existing analyses so as to test them against alternative signals. The idea goes as follows. Imagine that a collaboration performs a search for a fancy new physics model. In practice, what is searched for is a set of final-state particles, say, a pair of muons, jets with unbalanced transverse energy, etc. The same final state may arise in a large class of models, many of which the experimenters would not think of, or which might not even exist at the time the analysis is done. The idea of RECAST is to provide an interface via which theorists or other experimentalists could submit a new signal (simply at the partonic level, in some common Les Houches format). RECAST would run the new signal through the analysis chain, including hadronization, detector simulation, and exactly the same kinematic cuts as in the original analysis. Typically, most of the experimental effort goes into simulating the standard model background, which has already been done by the original analysis. Thus, simulating the new signal and producing limits on the production cross section of the new model would be a matter of seconds. At the same time, the impact of the original analyses could be tremendously expanded.
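To make the workflow concrete, here is a minimal sketch in Python of what that recycling step boils down to for a simple counting analysis. Nothing below is the actual RECAST code or API: the function names, the callables, and the toy stubs are illustrative assumptions, and the numbers are made up.

```python
# A minimal sketch of the RECAST idea for a simple counting analysis.
# Not the actual RECAST API: all names and numbers here are hypothetical.

def recast_limit(partonic_events, shower, detector_sim, passes_cuts,
                 luminosity_pb, n95):
    """Recycle a frozen analysis chain to set a limit on a new signal.

    partonic_events -- the new signal, e.g. parsed from a Les Houches file
    shower          -- hadronization step of the original analysis chain
    detector_sim    -- detector simulation of the original analysis chain
    passes_cuts     -- exactly the kinematic cuts of the original search
    luminosity_pb   -- integrated luminosity of the original data, in pb^-1
    n95             -- 95% CL upper limit on the number of signal events,
                       already derived by the original analysis from its
                       standard model background estimate
    """
    n_passed = sum(1 for ev in partonic_events
                   if passes_cuts(detector_sim(shower(ev))))
    efficiency = n_passed / len(partonic_events)
    if efficiency == 0.0:
        raise ValueError("no events pass the cuts; "
                         "the analysis is blind to this signal")
    # Counting experiment: sigma_95 = N_95 / (efficiency * luminosity)
    return n95 / (efficiency * luminosity_pb)


# Toy usage: identity stand-ins for the real simulation chain, just so the
# sketch runs end to end.
if __name__ == "__main__":
    import random

    random.seed(1)
    toy_signal = [{"pt_mu1": random.uniform(0, 100)} for _ in range(10000)]
    sigma_95 = recast_limit(
        toy_signal,
        shower=lambda ev: ev,                      # stand-in
        detector_sim=lambda ev: ev,                # stand-in
        passes_cuts=lambda ev: ev["pt_mu1"] > 25,  # one made-up cut
        luminosity_pb=1000.0,
        n95=5.0,
    )
    print(f"95% CL limit on the production cross section: {sigma_95:.4f} pb")
```

The point of the sketch: everything expensive (the background estimate, and hence n95) is inherited from the original analysis, so the only new work per submitted signal is simulating it and counting how often it survives the frozen cuts.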
There is some hope that RECAST may click with experimentalists. First of all, it does not put a lot of additional burden on collaborations. For a given analysis, it only requires a one-time effort of interfacing it into RECAST (and one could imagine that at some point this step could be automated too). The returns for this additional work would be a higher exposure of the analysis, which means more citations, which means more fame, more job offers, more money, more women... At the same time, RECAST ensures that no infidel hands ever touch the raw data. Finally, RECAST is not designed as a discovery tool, so the collaborations would keep the monopoly on that most profitable part of the business. All in all, lots of profit for a small price. Will it be enough to overcome the inertia? For the moment, the only analysis available in the RECAST format is the search for a Higgs decaying into 4 tau leptons, performed recently by the ALEPH collaboration. For the program to take off, more analyses have to be incorporated. That depends on you...
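To get a feel for how small that one-time interfacing price might be, here is a guess (again, not RECAST's actual design) at the shape of such a wrapper: the collaboration freezes its published selection and the numbers its original fit already produced behind a fixed interface, and RECAST only ever calls that.

```python
# Hypothetical wrapper exposing an existing analysis to RECAST. The interface
# and all numbers are invented for illustration; a real wrapper would delegate
# to the collaboration's own software and quote the published values.

class FrozenAnalysis:
    """One published search, interfaced once and then reusable forever."""

    luminosity_pb = 1000.0  # integrated luminosity of the original data set
    n95 = 5.0               # 95% CL limit on signal events, from the original fit

    def passes_cuts(self, reco_event):
        # The verbatim kinematic cuts of the published analysis belong here;
        # this placeholder checks a single made-up dimuon selection.
        return (reco_event.get("n_muons", 0) >= 2
                and reco_event.get("missing_et", 0.0) > 30.0)
```

Plugged into the sketch above as passes_cuts, such a frozen object is all an outsider's signal ever interacts with; the raw data never enter the interface.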
Come visit the RECAST web page and tell the authors what you think about their proposal. See also another report, written in a more this-will-never-happen vein.