Open Access

      Solving the Quantum Many-Body Problem with Artificial Neural Networks

Preprint


          Abstract

The challenge posed by the many-body problem in quantum physics originates from the difficulty of describing the non-trivial correlations encoded in the exponential complexity of the many-body wave function. Here we demonstrate that systematic machine learning of the wave function can reduce this complexity to a tractable computational form, for some notable cases of physical interest. We introduce a variational representation of quantum states based on artificial neural networks with a variable number of hidden neurons. A reinforcement-learning scheme is then demonstrated, capable of either finding the ground state or describing the unitary time evolution of complex interacting quantum systems. We show that this approach achieves very high accuracy in the description of equilibrium and dynamical properties of prototypical interacting spin models in both one and two dimensions, thus offering a powerful new tool to solve the quantum many-body problem.
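
As a concrete illustration of the ansatz sketched in the abstract: once the hidden units of a restricted Boltzmann machine with +/-1 spins are traced out analytically, the amplitude of a spin configuration takes a closed form, psi(s) = exp(sum_i a_i s_i) * prod_j 2 cosh(b_j + sum_i W_ji s_i). The snippet below is a minimal sketch of evaluating that form in NumPy; all names, sizes, and parameter values are illustrative assumptions, not the authors' code.

    import numpy as np

    def rbm_amplitude(spins, a, b, W):
        """Unnormalized wave-function amplitude psi(s) of an RBM ansatz.

        Hidden units (h = +/-1) are summed out analytically, giving
        psi(s) = exp(a . s) * prod_j 2*cosh(b_j + (W @ s)_j).
        """
        theta = b + W @ spins            # effective angles of the hidden units
        return np.exp(a @ spins) * np.prod(2.0 * np.cosh(theta))

    # Toy example (hypothetical values): 4 visible spins, 8 hidden units.
    rng = np.random.default_rng(0)
    n_vis, n_hid = 4, 8
    a = 0.01 * rng.standard_normal(n_vis)
    b = 0.01 * rng.standard_normal(n_hid)
    W = 0.01 * rng.standard_normal((n_hid, n_vis))

    s = np.array([1, -1, 1, -1])         # one sigma^z spin configuration
    print(rbm_amplitude(s, a, b, W))

In the full scheme the parameters (a, b, W) would be optimized variationally toward the ground state, or evolved to follow the unitary dynamics; that machinery is omitted from this sketch.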


Most cited references (2)


Representational power of restricted Boltzmann machines and deep belief networks.

          Deep belief networks (DBN) are generative neural network models with many layers of hidden explanatory factors, recently introduced by Hinton, Osindero, and Teh (2006) along with a greedy layer-wise unsupervised learning algorithm. The building block of a DBN is a probabilistic model called a restricted Boltzmann machine (RBM), used to represent one layer of the model. Restricted Boltzmann machines are interesting because inference is easy in them and because they have been successfully used as building blocks for training deeper models. We first prove that adding hidden units yields strictly improved modeling power, while a second theorem shows that RBMs are universal approximators of discrete distributions. We then study the question of whether DBNs with more layers are strictly more powerful in terms of representational power. This suggests a new and less greedy criterion for training RBMs within DBNs.
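
To make the universal-approximation statement above concrete, the sketch below (hypothetical, not from the cited paper) enumerates the exact distribution over visible units of a tiny binary RBM by summing out its hidden units. With enough hidden units, such a table can be made to approximate any target distribution over {0,1}^n arbitrarily well.

    import itertools
    import numpy as np

    def rbm_visible_distribution(a, b, W):
        """Exact p(v) for a small binary RBM (v, h in {0,1}).

        Summing out the hidden units gives p(v) proportional to
        exp(a . v) * prod_j (1 + exp(b_j + (W @ v)_j)).
        """
        n = len(a)
        configs = [np.array(v) for v in itertools.product([0, 1], repeat=n)]
        unnorm = np.array([
            np.exp(a @ v) * np.prod(1.0 + np.exp(b + W @ v))
            for v in configs
        ])
        return configs, unnorm / unnorm.sum()

    # Usage (illustrative parameters): a valid distribution over {0,1}^2.
    cfgs, p = rbm_visible_distribution(
        a=np.zeros(2), b=np.zeros(3), W=np.ones((3, 2)))
    print(list(zip(map(tuple, cfgs), p)))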

            A structural approach to relaxation in glassy liquids


              Author and article information

Published: 2016-06-07
Article: arXiv:1606.02318
Record ID: 37db3dd4-8724-4a47-ba2b-90b1c36a7e06
License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
Subject classes: cond-mat.dis-nn, cond-mat.quant-gas, quant-ph

Keywords: Quantum physics & Field theory, Quantum gases & Cold atoms, Theoretical physics
