
      Large Data and Zero Noise Limits of Graph-Based Semi-Supervised Learning Algorithms

      Preprint


          Abstract

Scalings in which the graph Laplacian approaches a differential operator in the large graph limit are used to develop understanding of a number of algorithms for semi-supervised learning; in particular, the extension to this graph setting of the probit algorithm, level set methods, and kriging is studied. Both optimization and Bayesian approaches are considered, based on a regularizing quadratic form constructed from an affine transformation of the Laplacian, raised to a possibly fractional exponent. Conditions on the parameters defining this quadratic form are identified under which well-defined limiting continuum analogues of the optimization and Bayesian semi-supervised learning problems may be found, thereby shedding light on the design of algorithms in the large graph setting. The large graph limits of the optimization formulations are tackled through \(\Gamma\)-convergence, using the recently introduced \(TL^p\) metric. The small labelling noise limits of the Bayesian formulations are also identified and contrasted with pre-existing harmonic function approaches to the problem.
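The regularizing quadratic form described above can be made concrete with a small sketch. The following is not code from the paper; it is an illustrative implementation, under my own assumptions, of graph-based kriging with a prior precision \(C^{-1} = (L + \tau^2 I)^\alpha\), i.e. an affine transformation of the Laplacian raised to a (possibly fractional) exponent. All function names and parameter defaults are hypothetical.

```python
import numpy as np

def graph_laplacian(X, eps=0.5):
    """Unnormalized graph Laplacian of a Gaussian-kernel proximity graph on points X."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-D2 / (2 * eps ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def krige(L, labelled_idx, labels, tau=1.0, alpha=1.0, gamma=0.1):
    """MAP estimate (kriging) on the graph.

    Prior precision C^{-1} = (L + tau^2 I)^alpha; Gaussian label noise
    of variance gamma^2 on the labelled vertices.  Minimises
        (1/2) <u, C^{-1} u> + (1 / (2 gamma^2)) |H u - y|^2.
    """
    n = L.shape[0]
    # Fractional power via eigendecomposition (L is symmetric PSD,
    # so L + tau^2 I is symmetric positive definite).
    lam, V = np.linalg.eigh(L + tau ** 2 * np.eye(n))
    Cinv = V @ np.diag(lam ** alpha) @ V.T
    # Observation operator H selecting the labelled vertices.
    H = np.zeros((len(labelled_idx), n))
    H[np.arange(len(labelled_idx)), labelled_idx] = 1.0
    # First-order optimality condition is a linear system.
    A = Cinv + H.T @ H / gamma ** 2
    return np.linalg.solve(A, H.T @ labels / gamma ** 2)
```

On two well-separated point clouds with one label per cloud, the recovered field takes the sign of the nearby label across each cloud, which is the behaviour the continuum limits in the paper are designed to preserve.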


Most cited references (12)

          Normalized cuts and image segmentation


            A Tutorial on Spectral Clustering

In recent years, spectral clustering has become one of the most popular modern clustering algorithms. It is simple to implement, can be solved efficiently by standard linear algebra software, and very often outperforms traditional clustering algorithms such as the k-means algorithm. At first glance spectral clustering appears slightly mysterious, and it is not obvious why it works at all and what it really does. The goal of this tutorial is to give some intuition on those questions. We describe different graph Laplacians and their basic properties, present the most common spectral clustering algorithms, and derive those algorithms from scratch by several different approaches. Advantages and disadvantages of the different spectral clustering algorithms are discussed.
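The two references above (normalized cuts and the spectral clustering tutorial) share a common core: partition a graph by thresholding an eigenvector of a normalized Laplacian. The sketch below is my own minimal illustration of that idea, not code from either paper; the function name is hypothetical.

```python
import numpy as np

def spectral_bipartition(W):
    """Two-way spectral cut from a symmetric weight matrix W.

    Uses the sign of the second-smallest eigenvector of the symmetric
    normalized Laplacian L_sym = I - D^{-1/2} W D^{-1/2}, the standard
    relaxation of the normalized cut criterion.
    """
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_sym = np.eye(len(d)) - D_inv_sqrt @ W @ D_inv_sqrt
    _, V = np.linalg.eigh(L_sym)       # eigenvalues in ascending order
    fiedler = D_inv_sqrt @ V[:, 1]     # undo the D^{1/2} rescaling
    return (fiedler > 0).astype(int)
```

For a weight matrix built from two well-separated clusters, the second eigenvector is approximately piecewise constant with opposite signs on the two clusters, so thresholding it at zero recovers the partition.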

              The spectral function of an elliptic operator


Author and article information

Journal
Date: 23 May 2018
Type: Article
arXiv ID: 1805.09450
Record ID: f7c0b92b-c513-450a-bd18-4fb19b8a75a2
License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
MSC classes: 62G20, 62C10, 62F15, 49J55
arXiv subjects: stat.ML, cs.LG, math.AP
Keywords: Analysis, Machine learning, Artificial intelligence
