In recent years, machine learning (ML) methods have become increasingly popular in computational chemistry. After being trained on appropriate ab initio reference data, these methods can accurately predict the properties of chemical systems, circumventing the need to explicitly solve the electronic Schrödinger equation. Because of their computational efficiency and scalability to large datasets, deep neural networks (DNNs) are a particularly promising class of ML algorithms for chemical applications.

We present a benchmark test suite and an automated machine learning procedure for evaluating supervised ML models that predict properties of inorganic bulk materials. The test suite, Matbench, is a set of 13 ML tasks that range in size from 312 to 132k samples and contain data from 10 density functional theory-derived and experimental sources. Tasks include predicting optical, thermal, electronic, thermodynamic, tensile, and elastic properties given a material's composition and/or crystal structure. The reference algorithm, Automatminer, is a highly extensible, fully automated ML pipeline for predicting materials properties from materials primitives (such as composition and crystal structure) without user intervention or hyperparameter tuning. We test Automatminer on the Matbench test suite and compare its predictive power with state-of-the-art crystal graph neural networks and a traditional descriptor-based Random Forest model. We find Automatminer achieves the best performance on 8 of the 13 tasks. We also show the test suite is capable of exposing the predictive advantages of each algorithm: namely, crystal graph methods appear to outperform traditional machine learning methods given ~10^4 or more data points. We encourage evaluating materials ML algorithms on the Matbench benchmark and comparing them against the latest version of Automatminer.

Predicting properties from a material's composition or structure is of great interest for materials design. Deep learning has recently garnered considerable interest in materials predictive tasks, achieving low model errors when dealing with large materials datasets; however, deep learning models suffer in the small-data regime that is common in materials science. Here we develop the AtomSets framework, which combines universal compositional and structural descriptors extracted from pre-trained graph network deep learning models with standard multi-layer perceptrons to achieve consistently high model accuracy for both small compositional data (<400 samples) and large structural data (>130,000 samples). The AtomSets models show lower errors than graph network models at small data limits and than other non-deep-learning models at large data limits. They also transfer better in a simulated materials discovery process where the targeted materials have property values outside the training data limits. The models require minimal domain knowledge as input and are free from feature engineering. The AtomSets framework can thus potentially accelerate machine-learning-assisted materials design and discovery with fewer data restrictions.
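To make the descriptor-based Random Forest baseline above concrete, here is a minimal sketch, assuming matminer, pymatgen, and scikit-learn are installed: it loads one composition-only Matbench task through matminer, featurizes each composition with Magpie element-property statistics, and scores a Random Forest with plain cross-validation. The dataset and column names follow the matbench_expt_gap task as shipped with matminer and are assumptions about that one task; the plain k-fold split is a simplification, not the official nested-CV benchmark protocol.

```python
# Minimal sketch (not the official Matbench protocol): descriptor-based
# Random Forest baseline on a composition-only task. Dataset and column
# names assume the matbench_expt_gap task as shipped with matminer.
from matminer.datasets import load_dataset
from matminer.featurizers.composition import ElementProperty
from pymatgen.core import Composition
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Load the task as a pandas DataFrame (composition string -> expt. gap).
df = load_dataset("matbench_expt_gap")
df["composition"] = df["composition"].map(Composition)

# Magpie preset: statistics of elemental properties (~132 descriptors).
featurizer = ElementProperty.from_preset("magpie")
df = featurizer.featurize_dataframe(df, col_id="composition")

X = df[featurizer.feature_labels()].values
y = df["gap expt"].values  # assumed target column for this task

# Plain 5-fold CV; the real benchmark uses fixed nested-CV splits.
rf = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
mae = -cross_val_score(rf, X, y, cv=5,
                       scoring="neg_mean_absolute_error").mean()
print(f"Random Forest MAE: {mae:.3f} eV")
```

The same recipe extends to structure-based tasks by swapping in matminer's structure featurizers in place of the composition one.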
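The AtomSets transfer recipe can likewise be sketched generically: pool per-atom embedding vectors taken from a pre-trained graph network into one fixed-length structure descriptor, then fit a small multi-layer perceptron on top. Everything below is a hedged illustration, not the paper's implementation: `atom_embeddings` is a toy stand-in for activations extracted from a real pre-trained model (the paper builds on graph networks such as MEGNet), and mean pooling is one simple readout choice.

```python
# Generic sketch of an AtomSets-style transfer pipeline: frozen per-atom
# embeddings -> mean-pooled structure vector -> small MLP head.
# `atom_embeddings` is a TOY stand-in for features extracted from a real
# pre-trained graph network; replace it with actual model activations.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def atom_embeddings(atomic_numbers) -> np.ndarray:
    """Toy per-atom features: a fixed 16-d sinusoidal code of the atomic
    number, used here only so the sketch executes end to end."""
    z = np.asarray(atomic_numbers, dtype=float)[:, None]  # (n_atoms, 1)
    freqs = 2.0 ** np.arange(8)[None, :]                  # (1, 8)
    return np.concatenate([np.sin(z / freqs), np.cos(z / freqs)], axis=1)

def structure_vector(atomic_numbers) -> np.ndarray:
    # Mean-pool atom features so structures with different atom counts
    # all map to the same fixed-length descriptor.
    return atom_embeddings(atomic_numbers).mean(axis=0)

def fit_atomsets_like(structures, targets):
    X = np.stack([structure_vector(s) for s in structures])
    # MLP head on frozen descriptors: no hand-built features, and small
    # enough to train on a few hundred labels.
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                     random_state=0),
    )
    return model.fit(X, np.asarray(targets, dtype=float))

# Usage with fake "structures" given as lists of atomic numbers.
model = fit_atomsets_like([[8, 14, 14], [3, 3, 8, 8], [26, 8, 8, 8]],
                          targets=[1.2, 0.7, 2.1])
print(model.predict(np.stack([structure_vector([8, 14, 14])])))
```

Swapping the toy embedding for real pre-trained activations is the essential step; the small MLP head stays the same across tasks, which is what lets this kind of pipeline operate at both the small (<400 sample) and large (>130,000 sample) scales described above.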