Enabling robust offline active learning for machine learning potentials using simple physics-based priors. Shuaibi, M., Sivakumar, S., Chen, R. Q., & Ulissi, Z. W. Machine Learning: Science and Technology, 12, 2020.
Machine learning surrogate models for quantum mechanical simulations have enabled the field to study material and molecular systems efficiently and accurately. Such models typically rely on a substantial amount of data to make reliable predictions of the potential energy landscape, or on careful active learning and uncertainty estimates. When starting with small datasets, convergence of active learning approaches is a major outstanding challenge that has limited most demonstrations to online active learning. In this work we demonstrate a Δ-machine learning approach that enables stable convergence in offline active learning strategies by avoiding unphysical configurations, with initial datasets as small as a single data point. We demonstrate our framework's capabilities on a structural relaxation, a transition state calculation, and a molecular dynamics simulation, with the number of first-principles calculations reduced by 70–90%. The approach is incorporated and developed alongside AMPtorch, an open-source machine learning potential package, along with interactive Google Colab notebook examples.
@article{10.1088/2632-2153/abcc44,
	author={Muhammed Shuaibi and Saurabh Sivakumar and Rui Qi Chen and Zachary W Ulissi},
	title={Enabling robust offline active learning for machine learning potentials using simple physics-based priors},
	journal={Machine Learning: Science and Technology},
	url={http://iopscience.iop.org/article/10.1088/2632-2153/abcc44},
	doi={10.1088/2632-2153/abcc44},
	year={2020},
	month={12},
	abstract={Machine learning surrogate models for quantum mechanical simulations have enabled the field to study material and molecular systems efficiently and accurately. Such models typically rely on a substantial amount of data to make reliable predictions of the potential energy landscape, or on careful active learning and uncertainty estimates. When starting with small datasets, convergence of active learning approaches is a major outstanding challenge that has limited most demonstrations to online active learning. In this work we demonstrate a Δ-machine learning approach that enables stable convergence in offline active learning strategies by avoiding unphysical configurations, with initial datasets as small as a single data point. We demonstrate our framework's capabilities on a structural relaxation, a transition state calculation, and a molecular dynamics simulation, with the number of first-principles calculations reduced by 70--90\%. The approach is incorporated and developed alongside AMP\textit{torch}, an open-source machine learning potential package, along with interactive Google Colab notebook examples.}
}