Open Access

      Machine Learning Force Fields

Review article


          Abstract

          In recent years, the use of machine learning (ML) in computational chemistry has enabled numerous advances previously out of reach due to the computational complexity of traditional electronic-structure methods. One of the most promising applications is the construction of ML-based force fields (FFs), with the aim to narrow the gap between the accuracy of ab initio methods and the efficiency of classical FFs. The key idea is to learn the statistical relation between chemical structure and potential energy without relying on a preconceived notion of fixed chemical bonds or knowledge about the relevant interactions. Such universal ML approximations are in principle only limited by the quality and quantity of the reference data used to train them. This review gives an overview of applications of ML-FFs and the chemical insights that can be obtained from them. The core concepts underlying ML-FFs are described in detail, and a step-by-step guide for constructing and testing them from scratch is given. The text concludes with a discussion of the challenges that remain to be overcome by the next generation of ML-FFs.
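The core idea described above, learning the statistical relation between structure and potential energy directly from reference data, can be sketched in a few lines. The toy model below is purely illustrative and is not any method from the review: it fits kernel ridge regression to energies of a one-dimensional Morse "reference" potential using an inverse-distance descriptor, then recovers forces as the negative gradient of the learned energy surface. The descriptor, kernel, and all parameter values are assumptions chosen for the sketch.

```python
import numpy as np

# Toy 1-D example: learn the energy of a diatomic from reference data, then
# recover the force as the negative gradient of the learned energy surface.
# The descriptor, kernel, and all parameter values are illustrative choices.

def morse(r, D=1.0, a=1.5, r0=1.0):
    """Stand-in for an ab initio reference: a Morse potential."""
    return D * (1.0 - np.exp(-a * (r - r0))) ** 2

r_train = np.linspace(0.6, 2.5, 40)   # reference bond lengths
x_train = 1.0 / r_train               # inverse-distance descriptor
y_train = morse(r_train)              # reference energies

def kernel(x1, x2, sigma=0.25):
    """Gaussian kernel between two 1-D descriptor arrays."""
    return np.exp(-((x1[:, None] - x2[None, :]) ** 2) / (2.0 * sigma**2))

# Kernel ridge regression: solve (K + lambda*I) alpha = y
K = kernel(x_train, x_train)
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(x_train)), y_train)

def energy(r):
    """ML energy prediction at bond length r."""
    return (kernel(np.atleast_1d(1.0 / r), x_train) @ alpha)[0]

def force(r, h=1e-4):
    """Force = -dE/dr, via central finite differences on the ML surface."""
    return -(energy(r + h) - energy(r - h)) / (2.0 * h)

print(energy(1.3), morse(1.3))   # learned vs reference energy
print(force(1.3))                # restoring force (negative for r > r0)
```

Note that no functional form for the bond was assumed: the model's accuracy is limited only by the quality and coverage of the reference data, which is exactly the promise (and the caveat) discussed in the abstract.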

          Related collections

Most cited references: 351


          Generalized Gradient Approximation Made Simple

            Deep learning.

            Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
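The mechanism this abstract describes, backpropagation telling each layer how to change its internal parameters from the representation error of the layer above, can be illustrated with a minimal two-layer network trained on XOR. The architecture, data, and hyperparameters below are illustrative choices, not taken from the article.

```python
import numpy as np

# Minimal backpropagation demo: a two-layer network learns XOR by propagating
# the loss gradient from the output layer back through the hidden layer.
# All sizes and hyperparameters here are illustrative choices.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR: not linearly separable

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def loss(p):
    """Binary cross-entropy against the XOR targets."""
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

lr = 0.5
for step in range(5000):
    # Forward pass: each layer transforms the previous layer's representation
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: propagate gradients from the output back to each layer
    dz2 = (p - y) / len(X)            # grad of cross-entropy w.r.t. output logits
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T                   # gradient flowing into the hidden layer
    dz1 = dh * (1.0 - h**2)           # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz1; db1 = dz1.sum(0)
    # Gradient-descent update of the internal parameters
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2)
print("final loss:", loss(pred))
```

The backward pass is the "indicate how a machine should change its internal parameters" step: each layer's gradient is computed from the gradient of the layer above it, which is what makes training deep stacks of layers tractable.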

              A consistent and accurate ab initio parametrization of density functional dispersion correction (DFT-D) for the 94 elements H-Pu.

The method of dispersion correction as an add-on to standard Kohn-Sham density functional theory (DFT-D) has been refined regarding higher accuracy, broader range of applicability, and less empiricism. The main new ingredients are atom-pairwise specific dispersion coefficients and cutoff radii that are both computed from first principles. The coefficients for new eighth-order dispersion terms are computed using established recursion relations. System (geometry) dependent information is used for the first time in a DFT-D type approach by employing the new concept of fractional coordination numbers (CN). They are used to interpolate between dispersion coefficients of atoms in different chemical environments. The method only requires adjustment of two global parameters for each density functional, is asymptotically exact for a gas of weakly interacting neutral atoms, and easily allows the computation of atomic forces. Three-body nonadditivity terms are considered. The method has been assessed on standard benchmark sets for inter- and intramolecular noncovalent interactions with a particular emphasis on a consistent description of light and heavy element systems. The mean absolute deviations for the S22 benchmark set of noncovalent interactions for 11 standard density functionals decrease by 15%-40% compared to the previous (already accurate) DFT-D version. Spectacular improvements are found for a tripeptide-folding model and all tested metallic systems. The rectification of the long-range behavior and the use of more accurate C(6) coefficients also lead to a much better description of large (infinite) systems as shown for graphene sheets and the adsorption of benzene on an Ag(111) surface. For graphene it is found that the inclusion of three-body terms substantially (by about 10%) weakens the interlayer binding. We propose the revised DFT-D method as a general tool for the computation of the dispersion energy in molecules and solids of any kind with DFT and related (low-cost) electronic structure methods for large systems.
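The pairwise dispersion sum this abstract describes can be sketched schematically: damped -C6/R^6 and -C8/R^8 terms summed over atom pairs, here with a rational (Becke-Johnson-style) damping function. The coefficients and damping parameters below are made-up placeholders, not the published D3 values, and the sketch omits the coordination-number interpolation and three-body terms that the actual method includes.

```python
import numpy as np

def damped_dispersion(coords, C6, C8, s6=1.0, s8=1.0, a1=0.4, a2=4.0):
    """Schematic two-body dispersion energy with rational damping.

    coords : (N, 3) atomic positions (atomic units)
    C6, C8 : (N, N) symmetric pairwise dispersion coefficients (atomic units)
    a1, a2 : damping parameters (placeholders; normally fitted per functional)
    """
    n = len(coords)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            R = np.linalg.norm(coords[i] - coords[j])
            R0 = np.sqrt(C8[i, j] / C6[i, j])   # pairwise critical radius
            f = a1 * R0 + a2                    # damping length scale
            # Damping keeps the correction finite at short range while the
            # terms decay as -C6/R^6 and -C8/R^8 at long range.
            energy -= s6 * C6[i, j] / (R**6 + f**6)
            energy -= s8 * C8[i, j] / (R**8 + f**8)
    return energy

# Two-atom example with placeholder coefficients
C6 = np.full((2, 2), 10.0)
C8 = np.full((2, 2), 200.0)
close = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 5.0]])
far = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 8.0]])
print(damped_dispersion(close, C6, C8))
print(damped_dispersion(far, C6, C8))
```

The correction is attractive (negative) everywhere and weakens with separation; in the full method the C6/C8 coefficients would additionally depend on each atom's fractional coordination number.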

                Author and article information

Journal
Journal: Chemical Reviews (Chem Rev)
Publisher: American Chemical Society
ISSN: 0009-2665 (print); 1520-6890 (electronic)
Publication dates: 11 March 2021; 25 August 2021
Volume: 121
Issue: 16 (Machine Learning at the Atomic Scale)
Pages: 10142-10186
Affiliations
Machine Learning Group, Technische Universität Berlin, 10587 Berlin, Germany
DFG Cluster of Excellence “Unifying Systems in Catalysis” (UniSysCat), Technische Universität Berlin, 10623 Berlin, Germany
BASLEARN, BASF-TU Joint Lab, Technische Universität Berlin, 10587 Berlin, Germany
Department of Physics and Materials Science, University of Luxembourg, L-1511 Luxembourg City, Luxembourg
BIFOLD-Berlin Institute for the Foundations of Learning and Data, Berlin, Germany
Department of Artificial Intelligence, Korea University, Anam-dong, Seongbuk-gu, Seoul 02841, Korea
Max Planck Institute for Informatics, Stuhlsatzenhausweg, 66123 Saarbrücken, Germany
Google Research, Brain Team, Berlin, Germany
                Author information
                http://orcid.org/0000-0001-7503-406X
                http://orcid.org/0000-0001-6091-3408
                http://orcid.org/0000-0002-3188-7017
                http://orcid.org/0000-0001-8342-0964
                http://orcid.org/0000-0002-1012-4854
                http://orcid.org/0000-0002-3861-7685
Article
DOI: 10.1021/acs.chemrev.0c01111
PMCID: PMC8391964
PMID: 33705118
                © 2021 The Authors. Published by American Chemical Society

Permits non-commercial access and re-use, provided that author attribution and integrity are maintained, but does not permit the creation of adaptations or other derivative works (CC BY-NC-ND 4.0: https://creativecommons.org/licenses/by-nc-nd/4.0/).

History: 12 October 2020
Funding
Funded by: H2020 European Research Council (doi 10.13039/100010663); Award ID: NA
Funded by: Korea University (doi 10.13039/501100002642); Award ID: 019-0-00079
Funded by: Institute for Information and Communications Technology Promotion (doi 10.13039/501100010418); Award ID: 2017-0-00451
Funded by: Bundesministerium für Bildung und Forschung (doi 10.13039/501100002347); Award IDs: 031L0207D, 01IS18037A, 01IS18025A, 01IS14013A-E, 01GQ1115, 01GQ0850
Funded by: Schweizerischer Nationalfonds zur Förderung der Wissenschaftlichen Forschung (doi 10.13039/501100001711); Award ID: P2BSP2_188147
Funded by: Deutsche Forschungsgemeinschaft (doi 10.13039/501100001659); Award IDs: Math+, EXC 2046/1
Categories
Review

Custom metadata
cr0c01111

Chemistry
