A Lawrence Livermore National Laboratory computational scientist, along with collaborators at the University of Massachusetts Dartmouth and the University of Mississippi, has developed a machine learning-based technique that automatically derives a mathematical model for the motion of binary black holes from raw gravitational wave data, requiring only the computing power of a laptop.
The work is described in "Learning Orbital Dynamics of Binary Black Hole Systems from Gravitational Wave Measurements," published in the journal Physical Review Research.
Working backward using gravitational wave data from numerical relativity simulations, the team designed an algorithm that could learn the differential equations describing the dynamics of merging black holes for a range of cases. The approach outputs an equation in a few minutes to an hour.
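The paper's own method is more sophisticated, but the general idea of learning governing equations from trajectory data can be illustrated with a sparse-regression sketch (in the spirit of techniques such as SINDy, not the authors' exact algorithm). Here a toy harmonic oscillator stands in for the binary system, and sequentially thresholded least squares recovers its differential equations from sampled data; all names and the toy system are illustrative assumptions.

```python
import numpy as np

# Toy system standing in for orbital dynamics: a harmonic oscillator
#   dx/dt = v,  dv/dt = -x
# We pretend we only have the sampled trajectory and its derivatives,
# and try to rediscover the right-hand sides from data.
t = np.linspace(0.0, 10.0, 1000)
x, v = np.cos(t), -np.sin(t)      # sampled "measurement" trajectory
dx, dv = v, -x                    # derivatives (exact here, for clarity)

# Library of candidate terms the equations could be built from.
Theta = np.column_stack([np.ones_like(t), x, v, x**2, x * v, v**2])
names = ["1", "x", "v", "x^2", "x*v", "v^2"]

def stlsq(Theta, dXdt, threshold=0.05, iters=10):
    """Sequentially thresholded least squares: fit, zero out small
    coefficients, refit on the surviving terms. Promotes sparsity."""
    Xi = np.linalg.lstsq(Theta, dXdt, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for k in range(dXdt.shape[1]):
            big = ~small[:, k]
            if big.any():
                Xi[big, k] = np.linalg.lstsq(
                    Theta[:, big], dXdt[:, k], rcond=None)[0]
    return Xi

Xi = stlsq(Theta, np.column_stack([dx, dv]))
for k, lhs in enumerate(["dx/dt", "dv/dt"]):
    terms = [f"{c:+.2f}*{n}" for c, n in zip(Xi[:, k], names) if c != 0.0]
    print(lhs, "=", " ".join(terms))
# Recovers dx/dt ≈ v and dv/dt ≈ -x with all other terms zeroed out.
```

Given clean data, the thresholding prunes every spurious library term, leaving exactly the two true ones; real gravitational-wave data is far noisier, which is part of what makes the published result notable.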
"Machine learning will tell us what the equations are automatically," says Brendan Keith, a postdoctoral researcher at LLNL's Center for Applied Scientific Computing. "And that equation might be as accurate as something a person had been working on for 10-to-20 years."
From Lawrence Livermore National Laboratory