Google DeepMind’s AI breakthrough has the potential to revolutionize the development of chips and batteries

AI researchers have predicted the structures of more than 2 million novel materials and released a list of the 381,000 judged most stable. They say the work represents the equivalent of nearly 800 years of accumulated knowledge.

Earlier this year, a South Korean laboratory unveiled what it touted as a groundbreaking discovery: the “holy grail” of electrical efficiency and a potential solution to the energy crisis. Scientists at Korea University in Seoul presented chunks of a grey-black polycrystalline compound called LK-99, claiming it was a superconductor that operates at room temperature and ambient pressure. However, these claims did not withstand subsequent scrutiny.

Given the enthusiasm that the prospect of a single breakthrough material can spark, a late-November announcement by researchers at Google DeepMind could have enormous implications.

These researchers used artificial intelligence (AI) to predict the structures of over 2 million new materials, a breakthrough with broad applications in sectors such as renewable energy, battery research, semiconductor design, and computing efficiency.

Accelerating materials discovery with AI

About 20,000 of the crystals experimentally identified in the Inorganic Crystal Structure Database (ICSD) are computationally stable. Computational approaches drawing on the Materials Project, the Open Quantum Materials Database and the WBM database boosted this number to 48,000 stable crystals. GNoME expands the number of stable materials known to humanity to 421,000.

DeepMind’s AI tool, Graph Networks for Materials Exploration (GNoME), makes designing and generating potential recipes for new materials look significantly more straightforward.

Why is this significant?

In a single stride, this AI-driven breakthrough expands the roster of ‘stable materials’ known to humanity roughly tenfold. These materials encompass the inorganic crystals crucial to modern technologies, from computer chips to batteries.

For emerging technologies, crystals must be stable so that they do not decompose. Although these materials still need to be synthesized and tested, DeepMind has released a list of the 381,000 crystal structures, out of the 2.2 million predicted, that it identified as the most stable.

DeepMind’s AI-driven approach scales up the materials selection process by applying filters to narrow the candidates down to synthesizable materials that could meet specified requirements. It can also drill down to predictions at the level of individual atomic bonds.
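As a rough illustration of what that kind of screening might look like, the sketch below filters a list of candidate crystals by a predicted stability measure and a target property window. The class, field names, thresholds and formulas are hypothetical assumptions for illustration, not GNoME’s actual schema or criteria.

```python
# Hypothetical screening sketch: keep candidates predicted to be stable and
# inside a band-gap window (a rough proxy for semiconductor applications).
from dataclasses import dataclass

@dataclass
class CandidateCrystal:
    formula: str
    decomposition_energy: float  # predicted energy above the convex hull, eV/atom
    band_gap: float              # predicted electronic band gap, eV

def screen(candidates, max_decomposition_energy=0.0, min_band_gap=0.5, max_band_gap=3.0):
    """Return only the candidates that pass both the stability and band-gap filters."""
    return [
        c for c in candidates
        if c.decomposition_energy <= max_decomposition_energy
        and min_band_gap <= c.band_gap <= max_band_gap
    ]

# Illustrative, made-up candidates.
candidates = [
    CandidateCrystal("A2BX4", -0.03, 1.9),  # stable and in the gap window -> kept
    CandidateCrystal("CDE3",   0.12, 1.4),  # predicted unstable -> filtered out
    CandidateCrystal("FG2",   -0.01, 5.8),  # stable but gap too wide -> filtered out
]
print([c.formula for c in screen(candidates)])
```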

In the past few decades, human experimentation has uncovered the structures of approximately 28,000 stable materials, cataloged in the Inorganic Crystal Structure Database – the most extensive repository of identified materials.

How does GNoME actually work?

In a blog post introducing the project, DeepMind researchers described the computational model’s design. GNoME is an advanced graph neural network (GNN) model, whose input data is structured as a graph representing the connections between atoms.
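To make that graph representation concrete, here is a minimal sketch that turns a handful of atoms into nodes and distance-based edges, the kind of input a GNN consumes. It assumes a generic distance-cutoff construction and made-up coordinates; it is not GNoME’s actual featurization.

```python
# Minimal sketch of building a graph from atomic positions: atoms become nodes,
# and pairs closer than a cutoff distance become edges.
import numpy as np

def build_graph(symbols, positions, cutoff=3.0):
    """Return node labels (atomic symbols) and an edge list (i, j, distance)
    for atom pairs closer than `cutoff` angstroms."""
    positions = np.asarray(positions, dtype=float)
    edges = []
    for i in range(len(symbols)):
        for j in range(i + 1, len(symbols)):
            d = float(np.linalg.norm(positions[i] - positions[j]))
            if d <= cutoff:
                edges.append((i, j, d))
    return symbols, edges

# Toy fragment with made-up coordinates; a real crystal graph would also handle
# periodic boundary conditions and richer node/edge features.
symbols = ["Na", "Cl", "Na", "Cl"]
positions = [(0.0, 0.0, 0.0), (2.8, 0.0, 0.0), (0.0, 2.8, 0.0), (2.8, 2.8, 0.0)]
nodes, edges = build_graph(symbols, positions)
print(nodes)
print(edges)
```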

GNoME was trained using “active learning,” a technique for expanding a model initially trained on a small, specialized dataset. Developers can then introduce new targets, and the most informative new data points are labeled – with human assistance or reference calculations – and fed back into training. This makes the algorithm well-suited to the scientific task of discovering new materials, which involves identifying patterns not present in the original dataset.
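A generic active-learning loop of this kind might look like the sketch below. The model, candidate generator, uncertainty measure and “DFT” labeling function are all hypothetical stand-ins for illustration, not DeepMind’s pipeline.

```python
# Generic active-learning loop: train on a small labeled set, generate new
# candidates, label the most uncertain ones with an expensive oracle
# (a stand-in for DFT here), and retrain.
import random

def train(labeled):                 # stand-in for fitting a GNN to (structure, energy) pairs
    return {"n_seen": len(labeled)}

def generate_candidates(n):         # stand-in for candidate-structure generation
    return [f"candidate_{random.randrange(10**6)}" for _ in range(n)]

def predict_uncertainty(model, candidate):  # stand-in; real pipelines might use ensembles
    return random.random()

def dft_label(candidate):           # stand-in for an expensive DFT stability calculation
    return random.uniform(-0.1, 0.3)

labeled = [("seed_structure", 0.0)]
for round_number in range(3):
    model = train(labeled)
    candidates = generate_candidates(100)
    # Label the candidates the model is least certain about, then retrain next round.
    to_label = sorted(candidates, key=lambda c: -predict_uncertainty(model, c))[:5]
    labeled += [(c, dft_label(c)) for c in to_label]
    print(f"round {round_number}: training set now has {len(labeled)} labeled structures")
```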

GNoME was initially trained on crystal structure data from The Materials Project, a collaborative initiative spanning multiple institutions and nations aimed at computing the properties of all inorganic materials and providing data freely to materials researchers.

DeepMind’s Amil Merchant and Ekin Dogus Cubuk explained in a November 29 blog post that GNoME was utilized to generate novel candidate crystals and predict their stability.

They assessed the model’s predictive power through progressive training cycles, repeatedly checking its performance against established computational techniques such as density functional theory (DFT), a method used in physics, chemistry, and materials science to understand atomic structures, and one that is crucial for assessing crystal stability.
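One simple way to picture such an evaluation is to compare the model’s predicted energies against DFT reference values and report the average error. The sketch below does this with made-up numbers; the metric and values are assumptions for illustration, not DeepMind’s reported benchmarks.

```python
# Illustrative check of predictive power: mean absolute error between
# model-predicted energies and DFT reference energies (values are made up).
def mean_absolute_error(predicted, reference):
    return sum(abs(p - r) for p, r in zip(predicted, reference)) / len(reference)

dft_energies = [-1.92, -0.45, -2.31, -0.88]        # eV/atom, made-up DFT labels
predicted_energies = [-1.85, -0.50, -2.40, -0.80]  # eV/atom, made-up model output

print(f"MAE: {mean_absolute_error(predicted_energies, dft_energies):.3f} eV/atom")
```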
