Computing low-rank approximations of large-scale matrices with the Tensor Network randomized SVD

Preprint

Abstract

We propose a new algorithm for the computation of a singular value decomposition (SVD) low-rank approximation of a matrix in the Matrix Product Operator (MPO) format, also called the Tensor Train Matrix format. Our tensor network randomized SVD (TNrSVD) algorithm is an MPO implementation of the randomized SVD algorithm that is able to compute dominant singular values and their corresponding singular vectors. In contrast to the state-of-the-art tensor-based alternating least squares SVD (ALS-SVD) and modified alternating least squares SVD (MALS-SVD) matrix approximation methods, TNrSVD can be up to 17 times faster while achieving the same accuracy. In addition, our TNrSVD algorithm also produces accurate approximations in particular cases where both ALS-SVD and MALS-SVD fail to converge. We also propose a new algorithm for the fast conversion of a sparse matrix into its corresponding MPO form, which is up to 509 times faster than the standard Tensor Train SVD (TT-SVD) method while achieving machine precision accuracy. The efficiency and accuracy of both algorithms are demonstrated in numerical experiments.
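
TNrSVD is an MPO reformulation of the randomized SVD of Halko, Martinsson and Tropp (the "Finding Structure with Randomness" reference listed below). For readers unfamiliar with that underlying algorithm, the following is a minimal dense-matrix sketch of it; the function name and the oversampling and power-iteration parameters are illustrative assumptions, and this is not the paper's MPO-based implementation, which operates on tensor network cores rather than full arrays.

```python
# A dense-matrix sketch of randomized SVD in the spirit of Halko, Martinsson
# and Tropp; rank, oversampling and power-iteration counts below are
# illustrative choices, not values taken from the paper.
import numpy as np

def randomized_svd(A, rank, oversample=10, n_power_iter=2):
    """Approximate the `rank` dominant singular triplets of A."""
    m, n = A.shape
    # A random Gaussian sketch captures the dominant column space of A.
    Omega = np.random.randn(n, rank + oversample)
    Y = A @ Omega
    # A few power iterations sharpen the decay of the sampled spectrum.
    for _ in range(n_power_iter):
        Y = A @ (A.T @ Y)
    # Orthonormal basis Q for the sampled range of A.
    Q, _ = np.linalg.qr(Y)
    # Project A onto that subspace and take an exact SVD of the small matrix.
    B = Q.T @ A
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_small
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Usage: a rank-30 test matrix is recovered essentially exactly with rank=30.
A = np.random.randn(500, 30) @ np.random.randn(30, 300)
U, s, Vt = randomized_svd(A, rank=30)
print(np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A))
```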

Most cited references (17)

• Tensor Decompositions and Applications

• Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions

• The density-matrix renormalization group in the age of matrix product states (open access)

The density-matrix renormalization group method (DMRG) has established itself over the last decade as the leading method for the simulation of the statics and dynamics of one-dimensional strongly correlated quantum lattice systems. In the further development of the method, the realization that DMRG operates on a highly interesting class of quantum states, so-called matrix product states (MPS), has allowed a much deeper understanding of the inner structure of the DMRG method, its further potential and its limitations. In this paper, I want to give a detailed exposition of current DMRG thinking in the MPS language in order to make the advisable implementation of the family of DMRG algorithms in exclusively MPS terms transparent. I then move on to discuss some directions of potentially fruitful further algorithmic development: while DMRG is a very mature method by now, I still see potential for further improvements, as exemplified by a number of recently introduced algorithms.
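
Both the MPS format discussed in this reference and the TT-SVD conversion mentioned in the main abstract rest on the same idea: splitting a large array into a chain of small three-index cores by repeated SVDs. The sketch below decomposes a vector into matrix product state cores in that way; the site count, physical dimension, and random test vector are illustrative assumptions, not taken from either paper.

```python
# A sketch of the sequential-SVD splitting behind MPS / tensor train formats
# (the same idea as the TT-SVD conversion discussed in the main abstract).
# The 8-site random vector, physical dimension 2 and optional bond cap are
# illustrative assumptions.
import numpy as np

def to_mps(psi, n_sites, d=2, max_bond=None):
    """Split a vector of length d**n_sites into a chain of 3-index MPS cores."""
    cores = []
    rest = psi.reshape(1, -1)  # remaining part, shape (left bond, rest)
    bond = 1
    for _ in range(n_sites - 1):
        # Group the left bond with one physical index and split the rest off.
        mat = rest.reshape(bond * d, -1)
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        if max_bond is not None:  # optional truncation for an approximate MPS
            U, s, Vt = U[:, :max_bond], s[:max_bond], Vt[:max_bond, :]
        new_bond = s.size
        cores.append(U.reshape(bond, d, new_bond))
        rest = np.diag(s) @ Vt
        bond = new_bond
    cores.append(rest.reshape(bond, d, 1))
    return cores

# Usage: decompose a random 8-site state exactly, then contract it back.
n = 8
psi = np.random.randn(2**n)
cores = to_mps(psi, n)
recon = cores[0]
for core in cores[1:]:
    recon = np.tensordot(recon, core, axes=([recon.ndim - 1], [0]))
print(np.allclose(recon.reshape(-1), psi))  # True: no truncation was applied
```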

Author and article information

Date: 24 July 2017
arXiv: 1707.07803
License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
Subject classes: math.NA, cs.NA
