      Quantum-chemical insights from deep tensor neural networks

          Abstract

          Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems. We unify concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate (1 kcal mol⁻¹) predictions in compositional and configurational chemical space for molecules of intermediate size. As an example of chemical relevance, the model reveals a classification of aromatic rings with respect to their stability. Further applications of our model for predicting atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure demonstrate the potential of machine learning for revealing insights into complex quantum-chemical systems.
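
          The abstract's central ideas, refining per-atom representations through pairwise interaction passes and summing per-atom contributions into a size-extensive total energy, can be illustrated with a short NumPy sketch. The array sizes, weight names (W_cf, W_df, W_fc), random initialisation, and toy molecule below are assumptions for illustration only, not the authors' published implementation.

import numpy as np

rng = np.random.default_rng(0)

n_atoms, n_basis, n_gauss = 5, 30, 25
Z = np.array([8, 1, 1, 6, 1])                          # toy nuclear charges (hypothetical fragment)
R = rng.normal(scale=1.5, size=(n_atoms, 3))           # toy Cartesian coordinates

# Atom representations initialised from an embedding of the nuclear charge
embedding = rng.normal(scale=0.1, size=(100, n_basis))
C = embedding[Z]                                       # c_i, shape (n_atoms, n_basis)

# Interatomic distances expanded on a grid of Gaussians
mu = np.linspace(0.0, 5.0, n_gauss)
D = np.linalg.norm(R[:, None, :] - R[None, :, :], axis=-1)
D_hat = np.exp(-(D[..., None] - mu) ** 2 / (2 * 0.25 ** 2))   # (n_atoms, n_atoms, n_gauss)

# Weights of a factorised (low-rank tensor) interaction layer
W_cf = rng.normal(scale=0.1, size=(n_basis, n_basis))
W_df = rng.normal(scale=0.1, size=(n_gauss, n_basis))
W_fc = rng.normal(scale=0.1, size=(n_basis, n_basis))
mask = 1.0 - np.eye(n_atoms)[:, :, None]               # exclude self-interaction (j == i)

for t in range(3):                                     # a few interaction passes
    # v_ij couples the representation of atom j with the distance expansion d_ij
    f = (C @ W_cf)[None, :, :] * (D_hat @ W_df)        # (n_atoms, n_atoms, n_basis)
    V = np.tanh(f @ W_fc) * mask
    C = C + V.sum(axis=1)                              # refine each atom's representation

# Size-extensive energy: the total is a sum of per-atom contributions
w_out = rng.normal(scale=0.1, size=(n_basis,))
E_atoms = np.tanh(C) @ w_out                           # hypothetical per-atom output network
E_total = E_atoms.sum()
print(E_atoms.round(4), float(E_total))

          Because the total energy is an explicit sum over atoms, the per-atom terms E_atoms are what gives access to the spatially resolved quantities (atomic energies, local chemical potentials) mentioned in the abstract.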

          Abstract

          Machine learning is an increasingly popular approach to analyse data and make predictions. Here the authors develop a 'deep learning' framework for quantitative predictions and qualitative understanding of quantum-mechanical observables of chemical systems, beyond properties trivially contained in the training data.

          Most cited references (15)


          Generalized Gradient Approximation Made Simple.

            Battery materials for ultrafast charging and discharging.

            The storage of electrical energy at high charge and discharge rate is an important technology in today's society, and can enable hybrid and plug-in hybrid electric vehicles and provide back-up for wind and solar energy. It is typically believed that in electrochemical systems very high power rates can only be achieved with supercapacitors, which trade high power for low energy density as they only store energy by surface adsorption reactions of charged species on an electrode material. Here we show that batteries which obtain high energy density by storing charge in the bulk of a material can also achieve ultrahigh discharge rates, comparable to those of supercapacitors. We realize this in LiFePO4 (ref. 6), a material with high lithium bulk mobility, by creating a fast ion-conducting surface phase through controlled off-stoichiometry. A rate capability equivalent to full battery discharge in 10-20 s can be achieved.

              The graph neural network model.

              Many underlying relationships among data in several areas of science and engineering, e.g., computer vision, molecular chemistry, molecular biology, pattern recognition, and data mining, can be represented in terms of graphs. In this paper, we propose a new neural network model, called graph neural network (GNN) model, that extends existing neural network methods for processing the data represented in graph domains. This GNN model, which can directly process most of the practically useful types of graphs, e.g., acyclic, cyclic, directed, and undirected, implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space. A supervised learning algorithm is derived to estimate the parameters of the proposed GNN model. The computational cost of the proposed algorithm is also considered. Some experimental results are shown to validate the proposed learning algorithm, and to demonstrate its generalization capabilities.
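
              The GNN model summarised above computes node states by iterating a transition function over neighbour states until a fixed point is reached, and then applies an output function to each node. The NumPy sketch below illustrates that fixed-point scheme on a toy graph; the linear-plus-tanh transition, the weight scales, and the graph itself are illustrative assumptions rather than the authors' formulation.

import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph (4 nodes) given by its adjacency matrix, plus node label vectors l_n
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
n_nodes, d_label, d_state, d_out = 4, 3, 8, 2
L = rng.normal(size=(n_nodes, d_label))

# Transition and output functions as small linear + tanh maps; the small scale on
# W_x keeps the state update a contraction, so the iteration converges.
W_l = rng.normal(scale=0.3, size=(d_label, d_state))
W_x = rng.normal(scale=0.05, size=(d_state, d_state))
W_o = rng.normal(scale=0.3, size=(d_state + d_label, d_out))

X = np.zeros((n_nodes, d_state))                 # node states x_n, iterated to a fixed point
for _ in range(200):
    X_new = np.tanh(L @ W_l + A @ X @ W_x)       # aggregate the states of each node's neighbours
    if np.max(np.abs(X_new - X)) < 1e-8:
        X = X_new
        break
    X = X_new

# Per-node outputs o_n = g_w(x_n, l_n), one vector in R^{d_out} for every node
O = np.concatenate([X, L], axis=1) @ W_o
print(O.shape)                                   # (4, 2)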

                Author and article information

                Journal
                Nat Commun
                Nat Commun
                Nature Communications
                Nature Publishing Group
                ISSN: 2041-1723
                Published: 09 January 2017
                Volume: 8
                Affiliations
                [1] Machine Learning Group, Technische Universität Berlin, Marchstr. 23, 10587 Berlin, Germany
                [2] Department of Brain and Cognitive Engineering, Korea University, Anam-dong, Seongbuk-gu, Seoul 136-713, Republic of Korea
                [3] Theory Department, Fritz-Haber-Institut der Max-Planck-Gesellschaft, Faradayweg 4-6, D-14195 Berlin, Germany
                [4] Physics and Materials Science Research Unit, University of Luxembourg, Luxembourg, L-1511 Luxembourg
                Article
                Article number: ncomms13890
                DOI: 10.1038/ncomms13890
                PMC: 5228054
                PMID: 28067221
                Copyright © 2017, The Author(s)

                This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

                Categories
                Article

