4th July 2012
It’s about thirty-five times heavier than any other quark, and as heavy as the nucleus of an atom of gold. The top quark is the elephant in the room of the standard model. It decays so quickly that detectors never see it directly; its existence can only be inferred by sifting through vast quantities of data produced by smashing subatomic particles together. “Without distributed computing, hunting for the top quark would be impossible,” said Marcel Vreeswijk, associate professor of subatomic physics at the University of Amsterdam, who works on the ATLAS experiment at the Large Hadron Collider (LHC) at CERN, in Switzerland.
ATLAS produces new data every 25 nanoseconds (billionths of a second), helping scientists home in on an accurate mass for the top quark. To cope with this flood of data, much of which is random ‘noise’, they filter it at the detector stage, keeping only promising signals. Even so, processor farms must reconstruct collision events hundreds of times per second, after which the data is sent out to the grid for further processing. “There are many people in different countries looking at the results,” Vreeswijk said.
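The scale of that filtering step can be sketched in a few lines. The snippet below is purely illustrative (it is not ATLAS software): it works out the raw bunch-crossing rate implied by one collision every 25 nanoseconds, and mimics a detector-stage filter that keeps only events above an energy threshold. The threshold, the simulated energy distribution, and the function names are all assumptions made for the example.

```python
# Illustrative sketch only (not ATLAS code): a detector-stage filter that
# keeps high-energy collision events and discards low-energy 'noise'.
import random

# One bunch crossing every 25 ns implies ~40 million crossings per second.
BUNCH_CROSSING_NS = 25
CROSSINGS_PER_SECOND = int(1e9 / BUNCH_CROSSING_NS)

def passes_trigger(event_energy_gev, threshold_gev=100.0):
    """Keep only events whose deposited energy exceeds a threshold (assumed value)."""
    return event_energy_gev > threshold_gev

# Simulate a batch of crossings: most deposit little energy (made-up distribution).
random.seed(42)
events = [random.expovariate(1 / 10.0) for _ in range(100_000)]
kept = [e for e in events if passes_trigger(e)]

print(f"{CROSSINGS_PER_SECOND:,} crossings per second before filtering")
print(f"kept {len(kept)} of {len(events)} simulated events")
```

The point of the sketch is the ratio: tens of millions of crossings per second go in, and only a tiny fraction survive to be fully reconstructed downstream.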
The mystery is in the mass
By simulating the high-energy conditions of the early universe, particle accelerators enable physicists to investigate the fundamental constituents of matter. Protons and neutrons in atoms are not the most basic building blocks, but are themselves made of three quarks each: a proton of two ‘up’ quarks and one ‘down’, a neutron of two ‘down’ quarks and one ‘up’. There are also four other, more exotic, types of quark besides ‘up’ and ‘down’ – including the top quark – that contribute to a picture of particle physics called the standard model. Top quarks were abundant in the early universe, but today they can be produced only in the world’s most powerful accelerators, such as the LHC, and they decay almost immediately.
“In nature we don’t see quarks on their own because they’re unstable. In our detectors, we see a signature shower of particles, including electrons, protons, pions, and muons,” Vreeswijk said. “The top quark decays immediately into a bottom quark and a W boson – and the W boson decays into other particles. From these decays we know the top quark mass more accurately than any other particle.”
The top and the Higgs
Physicists think that particles, including the top quark, acquire their mass by interacting with the Higgs field. Knowing the mass of the top quark accurately is important because it can be used as a ‘standard’ in collisions recorded by particle detectors, and will help point to the existence of the Higgs boson. Finding it, however, won’t be the end for physicists like Vreeswijk. “If we find the Higgs particle, that won’t be enough. We also have to test it against the standard model predictions,” Vreeswijk said.
At high energies, inconsistencies could begin to appear, such as deviations of measured masses from those predicted by theory, or experimental results not matching up with Monte Carlo simulations. “When you have inconsistencies like this, we know the standard model cannot be the model. There must be something else,” Vreeswijk said.
This opens up the possibility of new physics. One theory is supersymmetry, where each of the fundamental particles is mirrored by a partner with heavier mass. “It would complete the standard model in an elegant way,” said Vreeswijk.
There are even more exotic possibilities, such as extra dimensions, occupied by tiny black holes. “It sounds bizarre, but actually it’s based on solid theory,” Vreeswijk said.
Beyond the top quark
People are already working on the next generation of particle accelerators – such as CLIC at CERN – which will reach even higher energies. “They will produce even more data, so this brings us back to distributed computing,” Vreeswijk said. Even now, there is not enough computing power to process everything produced every 25 nanoseconds, so the next generation of faster computers will need to process more data without the need for filtering at the detector stage. This will allow physicists to investigate rare processes that would otherwise be lost.
Vreeswijk also pointed out the importance of investing in computing infrastructure. “There is a secondary benefit to distributed computing for data analysis,” Vreeswijk said. “Local scientific institutes in individual countries benefit from extra computer power, while avoiding overloading a central facility such as the one at CERN.”