On their quest to uncover what the universe is made of, researchers at the U.S. Department of Energy's Argonne National Laboratory are harnessing the power of supercomputers to make predictions about particle interactions that are more precise than ever before.
Argonne researchers have developed a new theoretical approach, ideally suited for high-performance computing systems, that is capable of making predictive calculations about particle interactions that conform almost exactly to experimental data. This new approach could give scientists a valuable tool for describing new physics and particles beyond those currently identified.
The framework makes predictions based on the Standard Model, the theory that describes the physics of the universe to the best of available knowledge. Researchers are now able to compare experimental data with predictions generated through this framework, to potentially uncover discrepancies that could indicate the existence of new physics beyond the Standard Model. Such a discovery would revolutionize the understanding of nature at the smallest measurable length scales.
"So far, the Standard Model of particle physics has been very successful in describing the particle interactions we have seen experimentally, but we know that there are things that this model doesn't describe completely," says Argonne theorist Radja Boughezal, who developed the framework with her team. "We don't know the full theory."
"The first step in discovering the full theory and new models involves looking for deviations with respect to the physics we know right now," Boughezal says. "Our hope is that there is deviation, because it would mean that there is something that we don't understand out there."
The theoretical method developed by the Argonne team is currently being deployed on Mira, one of the world's fastest supercomputers, which is housed at the Argonne Leadership Computing Facility, a DOE Office of Science User Facility.
Using Mira, researchers are applying the new framework to analyze the production of missing energy in association with a jet, a particle interaction of particular interest to researchers at the Large Hadron Collider (LHC) in Switzerland.
Physicists at the LHC are attempting to produce new particles that are known to exist in the universe but have yet to be seen in the laboratory, such as the dark matter that comprises a quarter of the mass and energy of the universe.
Although scientists have no way today of observing dark matter directly — hence its name — they believe that dark matter could leave a "missing energy footprint" in the wake of a collision that could indicate the presence of new particles not included in the Standard Model. These particles would interact very weakly and therefore escape detection at the LHC. The presence of a "jet," a spray of Standard Model particles arising from the break-up of the protons colliding at the LHC, would tag the presence of the otherwise invisible dark matter.
In the LHC detectors, however, a particular kind of interaction — called the Z-boson plus jet process — can mimic the signature of the potential signal that would arise from as-yet-unknown dark matter particles. Boughezal and her colleagues are using their new framework to help LHC physicists distinguish the Z-boson plus jet signature predicted by the Standard Model from other potential signals.
Previous attempts to distinguish the two processes relied on less precise calculations whose uncertainties were so large that they could not draw the fine distinctions needed to identify a new dark matter signal.
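The logic above can be illustrated with a toy significance estimate (this is not the team's actual method, and all numbers below are hypothetical): an excess of measured events over the Standard Model prediction only becomes statistically meaningful once the theoretical uncertainty on that prediction is small compared to the excess itself.

```python
import math

def excess_significance(measured, predicted, exp_unc, th_unc):
    """Significance (in standard deviations) of an excess of measured
    events over the predicted count, combining experimental and
    theoretical uncertainties in quadrature."""
    total_unc = math.sqrt(exp_unc**2 + th_unc**2)
    return (measured - predicted) / total_unc

# Hypothetical event counts: 10,500 observed vs. 10,000 predicted,
# with an experimental uncertainty of 200 events.
measured, predicted, exp_unc = 10_500, 10_000, 200

# With a coarse ~5% theoretical uncertainty, the excess is buried...
coarse = excess_significance(measured, predicted, exp_unc, th_unc=500)

# ...but with a ~1% precision calculation, the same excess stands out.
precise = excess_significance(measured, predicted, exp_unc, th_unc=100)
```

Here `coarse` comes out below one standard deviation while `precise` exceeds two, showing why shrinking the theoretical error bar — not just the experimental one — is what makes a deviation detectable.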
"It is only by calculating the Z-boson plus jet process very precisely that we can determine whether the signature is indeed what the Standard Model predicts, or whether the data indicates the presence of something new," says Frank Petriello, another Argonne theorist who helped develop the framework. "This new framework opens the door to using Z-boson plus jet production as a tool to discover new particles beyond the Standard Model."
Applications for this method go well beyond studies of the Z-boson plus jet. The framework will impact not only research at the LHC, but also studies at future colliders, which will produce increasingly precise, high-quality data, Boughezal and Petriello say.
"These experiments have gotten so precise, and experimentalists are now able to measure things so well, that it's become necessary to have these types of high-precision tools in order to understand what's going on in these collisions," Boughezal says.
"We're also so lucky to have supercomputers like Mira because now is the moment when we need these powerful machines to achieve the level of precision we're looking for; without them, this work would not be possible."
Funding and resources for this work were previously allocated through the Argonne Leadership Computing Facility's Director's Discretionary program. Support for this work will continue through allocations from the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program.