
Artificial Intelligence for Materials Discovery

Finding novel materials requires more than pure machine learning.
[Illustration: 3D chemical compounds floating in space]

The software-driven successes of deep learning have been profound, but the real world is made of materials. Researchers are turning to artificial intelligence (AI) to help find new materials to provide better electronics and transportation, and the energy to run them.

Despite its undeniable power, however, “Machine learning, especially the deep learning revolution, relies heavily on large amounts of data,” said Carla Gomes, a computer scientist at Cornell University. “This is not how science works,” she said. “The key next step is for us to incorporate more and more reasoning capabilities and combine these vast amounts of data with reasoning capabilities.”

“Machine learning as we know it is not enough for scientific discovery,” she said. “We still have a long way to go.”

Nevertheless, researchers are off to a promising start in applying these techniques to materials science.

Combinatorial Explosion

One of the challenges in materials discovery is the astronomical number of compositions that might have interesting properties. “High-entropy alloys” (HEA), for example, combine four or more metals. “If you consider all the elements in the periodic table, you will find that you have many combinations, then infinite combinations of the different elements, so that makes prediction very difficult,” explained Ziyuan Rao, a postdoc at the Max Planck Institute for Iron Research in Düsseldorf, Germany.

Nonetheless, Rao and his colleagues created a multistage analysis to search for alloys with low thermal expansion, which are important for cryogenic storage of liquefied natural gas and for other purposes. The analysis draws on extensive materials datasets, but the available compositions are a tiny, sparse subset of the universe of perhaps 10⁵⁰ possibilities.
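
To get a feel for how fast the space blows up, here is a back-of-the-envelope Python sketch. The numbers (60 usable elements, five-component alloys, a 1 at.% composition grid) are assumptions for illustration, not figures from the study:

    # Rough count of candidate high-entropy alloys under assumed numbers.
    from math import comb

    elements = 60        # usable metallic elements (assumed)
    components = 5       # metals per alloy
    step = 1             # composition resolution in at.% (assumed)

    # Ways to choose which five elements appear in the alloy.
    element_sets = comb(elements, components)

    # Compositions x1 + ... + x5 = 100 with each xi a positive multiple
    # of 1 at.% ("stars and bars"): C(99, 4) for five components.
    compositions = comb(100 // step - 1, components - 1)

    total = element_sets * compositions
    print(f"{element_sets:,} element sets x {compositions:,} compositions = {total:.1e}")

Even this coarse grid yields roughly 2×10¹³ candidates; finer composition steps, more components, and processing variables drive the space toward the 10⁵⁰ scale cited above.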

After training a machine-learning model with this data, the researchers used it to select promising candidates, often completely novel. They then used computationally intensive density-functional theory (DFT) calculations to get more precise estimates of each compound’s properties. DFT is a widely used shortcut around full quantum mechanical theory. In fact, researchers at DeepMind recently used deep learning to let DFT determine how electron charge is distributed between competing atoms, a longstanding challenge.

A key feature of the HEA search is active learning, which suggests new compositions to examine that will be most informative. “It’s a little different from traditional machine learning,” Rao said, which typically aims to increase the accuracy of the model. “We also want to use this model to predict new materials with very good properties.”

Indeed, Rao and his colleagues further refined their search by experimentally making and measuring some of the best candidates. “You need real-world data,” he said, because “simulated data sometimes is inaccurate.” The experimental results were folded back into the modeling, and the loop was repeated six times. The study successfully identified two new alloy compositions with a tiny thermal expansion coefficient, less than two parts per million per degree.
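
The loop Rao describes can be sketched generically. Below is a minimal active-learning cycle built on a Gaussian-process surrogate from scikit-learn; the two-dimensional featurization, the acquisition rule, and the evaluate() function standing in for DFT or experiment are all illustrative placeholders, not the authors' pipeline:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(0)

    def evaluate(x):
        # Placeholder oracle for a property such as thermal expansion;
        # in practice this is a DFT calculation or a lab measurement.
        return np.sin(3 * x[0]) + 0.1 * rng.normal()

    pool = rng.uniform(0, 1, size=(500, 2))   # featurized candidate compositions
    X = pool[:5].copy()                       # small initial training set
    y = np.array([evaluate(x) for x in X])

    for _ in range(6):                        # the article describes six loops
        model = GaussianProcessRegressor().fit(X, y)
        mean, std = model.predict(pool, return_std=True)
        # Acquisition: prefer candidates predicted to be good (low value)
        # but still uncertain -- informativeness, not just model accuracy.
        best = np.argmin(mean - 1.5 * std)
        X = np.vstack([X, pool[best]])
        y = np.append(y, evaluate(pool[best]))

    print("best property value found:", y.min())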

Maria Chan of the Center for Nanoscale Materials at the U.S. Department of Energy’s Argonne National Laboratory, and Arun Mannodi-Kanatkithodi, now an assistant professor of Materials Engineering at Purdue University, performed a similar search for new halide perovskites, which have tremendous potential for solar cells. Unlike applications such as health imaging, “In materials and chemistry, there’s this advantage we have where we can generate a lot of data using quantum-mechanical simulations,” Chan said. “A lot of machine learning is done on simulated data, in which we have control over the coverage of different inputs, we have control over the size of the data.”


Once a model has been trained, it is important to learn if it generalizes to inputs that are “at least somewhat” beyond the training data, Chan said. “I think that is the least that we can do.” The researchers ultimately whittled a group of some 18,000 possible compounds down to a more promising set of 400, including some that had never been examined before.

One challenge, very much an active area of research, is how to train a system to improve multiple properties. “It’s not just one thing that you care about,” Chan said. “There are many outputs that are important,” such as perovskites’ stability, band gap, and defect tolerance. Although the attributes can be combined into a single metric, for example for reinforcement learning, she noted the weighting of different properties is extremely important.
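
A minimal sketch of that scalarization, with hypothetical property names and weights chosen purely for illustration:

    # Fold several (pre-normalized, higher-is-better) properties into one score.
    def scalarize(props, weights):
        return sum(weights[k] * props[k] for k in weights)

    candidate = {"stability": 0.9, "band_gap_match": 0.6, "defect_tolerance": 0.4}
    weights   = {"stability": 0.5, "band_gap_match": 0.3, "defect_tolerance": 0.2}
    print(scalarize(candidate, weights))  # 0.71

One alternative that avoids committing to weights up front is to keep the full Pareto front of non-dominated candidates and let domain experts choose among them.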

In another research direction, “Machine learning and artificial intelligence really help” in materials characterization, Chan said, which is critical because structure is intimately related to properties. Techniques like microscopy and spectroscopy tend to be “inverse problems” that seek to determine what structure gives rise to the observations, a natural fit to tools that learn the relationship of input and output.
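
The inverse-problem pattern can be shown with a toy example: adjust the parameters of a forward model until its output matches an observation. Here two Gaussian peaks stand in for a real spectroscopy simulator; in practice the forward model is expensive, which is where a learned surrogate helps:

    import numpy as np
    from scipy.optimize import minimize

    x_axis = np.linspace(0, 10, 200)

    def forward(params):
        # Toy forward model: a "spectrum" with peaks at positions p1 and p2.
        p1, p2 = params
        return np.exp(-(x_axis - p1) ** 2) + np.exp(-(x_axis - p2) ** 2)

    observed = forward([3.0, 7.0])            # pretend this came from the instrument

    def misfit(params):
        return np.sum((forward(params) - observed) ** 2)

    result = minimize(misfit, x0=[2.0, 8.0], method="Nelder-Mead")
    print("recovered peak positions:", result.x)  # ~[3.0, 7.0]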

Deep Collaborations

These examples illustrate the growing impact of machine learning on the relationship between input conditions, such as material compositions, and output variables such as material properties. “The algorithm is good at that,” Chan said. “The scientific expertise is to figure out what’s the input and what’s the output. Not many people realize that’s the hardest thing.”

Often, she said, “Blind machine learning is not as successful as what we call physics-informed machine learning,” which incorporates constraints, asymptotic behaviors, symmetries, and physical laws.

“When you only have limited data, you have to incorporate the physics,” agreed Anima Anandkumar, a professor of Computing and Mathematical Sciences at the California Institute of Technology (Caltech). Embedding these rules in the tools can make the computations much more efficient.
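
A common way to embed such rules is to add a physics-residual term to the training loss, as in physics-informed neural networks. A minimal PyTorch sketch, using the toy law du/dx = -u rather than any equation from the article:

    import torch

    net = torch.nn.Sequential(
        torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)

    x_data = torch.tensor([[0.0], [1.0]])     # scarce measurements
    u_data = torch.exp(-x_data)               # true solution u(x) = exp(-x)

    for _ in range(2000):
        opt.zero_grad()
        data_loss = ((net(x_data) - u_data) ** 2).mean()

        # Physics residual at collocation points where there is *no* data.
        x_col = torch.rand(64, 1, requires_grad=True)
        u = net(x_col)
        du_dx, = torch.autograd.grad(u.sum(), x_col, create_graph=True)
        physics_loss = ((du_dx + u) ** 2).mean()   # enforce du/dx = -u

        (data_loss + physics_loss).backward()
        opt.step()

The physics term supervises the network everywhere, so far less labeled data is needed, which addresses the limited-data regime Anandkumar describes.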

Anandkumar and her colleagues have developed “neural operators” for solving partial differential equations (PDEs), which govern aspects of materials, as well as fluid dynamics and climate science. “Many of these processes require a very fine grid. That’s what leads to these enormous computational requirements,” she said. Using an adjustable grid yields calculations as much as five orders of magnitude faster, but “still preserves the fine scales,” she said, which is “not possible with standard neural networks because they operate at fixed resolution.”
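
The core trick behind Fourier neural operators, one family of neural operators, is compact: learn weights on a fixed number of low Fourier modes, so the same layer applies unchanged at any grid resolution. A bare-bones PyTorch sketch, with dimensions and initialization chosen only for illustration:

    import torch

    class SpectralConv1d(torch.nn.Module):
        def __init__(self, channels, modes):
            super().__init__()
            self.modes = modes
            self.weights = torch.nn.Parameter(
                torch.randn(channels, channels, modes, dtype=torch.cfloat) / channels)

        def forward(self, x):                 # x: (batch, channels, grid_points)
            x_ft = torch.fft.rfft(x)          # to frequency space
            out_ft = torch.zeros_like(x_ft)
            # Mix channels on the lowest `modes` frequencies only.
            out_ft[:, :, :self.modes] = torch.einsum(
                "bim,iom->bom", x_ft[:, :, :self.modes], self.weights)
            return torch.fft.irfft(out_ft, n=x.size(-1))

    layer = SpectralConv1d(channels=4, modes=8)
    coarse = layer(torch.randn(1, 4, 64))     # works on a coarse grid...
    fine = layer(torch.randn(1, 4, 1024))     # ...and, unchanged, on a much finer one

Because the learned weights live on Fourier modes rather than grid points, the layer can train on coarse grids and evaluate on fine ones, one route to the speedups described above.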

Anandkumar and her colleague Yisong Yue started the “AI4Science” initiative at Caltech (the name has been widely adopted for efforts elsewhere). There are many scientific problems that can be addressed with off-the-shelf tools, she said, so “AI scientists don’t need to be involved.” She noted that in many scientific problems, “The optimization landscape gets very hard, which in standard deep learning we don’t even worry about.”

“Where the standard tools don’t work, that’s where I think there has to be a very deep collaboration,” Anandkumar said. To make progress in these cases, it’s “really critical” to have people with “deep domain expertise in terms of how the current numerical solvers are applied … in these multiscale chaotic systems,” as well as AI experts to build a sophisticated framework “that has good generalization, that has the right inductive bias, that has the right constraints.”


“It takes a lot of patience” to make real progress applying AI to science, said John Gregoire, research professor of Applied Physics and Materials Science at Caltech, who has been collaborating with Gomes for more than a decade. The conference and publication pressures to benchmark their results against standard datasets can be “more of a barrier than a helpful solution,” especially for early-career researchers in computer science, he said. “Very few of them are willing to take on the challenge of talking with domain experts in a particular area of science to the extent that they can be impactful in that discipline.”

Beyond Deep Learning

“The direct applicability of various deep-learning techniques in materials discovery is just not that impactful,” Gregoire said. “We really need methods tailored toward the physical sciences.” Enormous databases of labeled images or data tables are “not what scientific prior knowledge looks like,” he said.

“We’re going to need new AI architectures to address specific problems in science,” Gregoire added. “There is no one architecture that’s going to solve everything.”

The ongoing question, Anandkumar said, is how “to develop better, more robust, more interpretable, guaranteed AI methods that really stand the test of these scientific applications.”

In some cases, successful approaches could combine neural networks with symbolic representations, for example, of mathematical relationships or other rules. “Any constraints you know from the system already come with symbols, because certain quantities need to be conserved,” she said. Combined neurosymbolic tools are “much more natural in the sciences,” Gomes said, than for “other cases where you don’t know what the constraints are.

“If you combine your observations with prior knowledge, scientific knowledge, and put them all together, you are able to generalize, as opposed to just relying on data,” she said. “Typically deep learning has [many] layers, but the layers don’t have a meaning, so incorporating prior knowledge is not trivial.”

She and her colleagues, including Gregoire, recently published a method they call “deep-reasoning nets.” A key element is the use of an encoder (similar to that used by Rao) that projects the input data into a lower-dimensional space. “Contrary to the standard, we assign an interpretation to this latent space,” she said.
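
The idea of assigning meaning to a latent space can be illustrated with a toy autoencoder whose latent vector is forced, via a softmax, to read as phase fractions: non-negative and summing to one. This sketches the concept only and is not the published architecture:

    import torch

    n_signal, n_phases = 128, 3   # e.g., diffraction-pattern length, phase count

    encoder = torch.nn.Sequential(
        torch.nn.Linear(n_signal, 64), torch.nn.ReLU(),
        torch.nn.Linear(64, n_phases), torch.nn.Softmax(dim=-1))

    # Each latent dimension owns a learnable "pure phase" pattern; the
    # reconstruction is their fraction-weighted mixture.
    phase_patterns = torch.nn.Parameter(torch.randn(n_phases, n_signal))

    def reconstruct(x):
        fractions = encoder(x)        # interpretable: per-phase fractions
        return fractions @ phase_patterns, fractions

    x = torch.rand(5, n_signal)       # stand-in for measured patterns
    recon, fractions = reconstruct(x)
    loss = ((recon - x) ** 2).mean()  # train encoder and patterns end to end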

By enforcing thermodynamic constraints, the technique successfully identified the phases in a mixture from its X-ray diffraction patterns, and it also separated overlaid handwritten Sudoku solutions. Useful new methodologies should be “very general and applicable across several domains,” Gomes said, as are workhorse techniques such as linear programming or regression. “That is not just machine learning. That’s the beauty of computer science.”

Further Reading

Qing-Miao Hu and Rui Yang,
“The Endless Search for Better Alloys,” Science 378, 26 (2022). https://doi.org/10.1126/science.ade5503

Carla P. Gomes, Bart Selman, and John M. Gregoire,
“Artificial Intelligence for Materials Discovery,” MRS Bulletin 44, 538 (2019). https://doi.org/10.1557/mrs.2019.158

Heather J. Kulik and Pratyush Tiwary,
“Artificial Intelligence in Computational Materials Science,” MRS Bulletin 47, 927–929 (2022). https://doi.org/10.1557/s43577-022-00431-1
