      Differential Privacy Dynamics of Langevin Diffusion and Noisy Gradient Descent

      Preprint (Open Access)


          Abstract

          We model the dynamics of privacy loss in Langevin diffusion and extend it to the noisy gradient descent algorithm: we compute a tight bound on the Rényi differential privacy and the rate at which it changes throughout the learning process. We prove that the privacy loss converges exponentially fast. This significantly improves on prior privacy analyses of differentially private (stochastic) gradient descent algorithms, in which the (Rényi) privacy loss increases monotonically over the training iterations. Unlike composition-based methods in differential privacy, our privacy analysis does not assume that the noisy gradients (or parameters) computed during training are revealed to the adversary. Our analysis tracks the dynamics of privacy loss through the algorithm's intermediate parameter distributions, thus allowing us to account for the privacy amplification due to convergence. We prove that our privacy analysis is tight, and we also provide a utility analysis for strongly convex, smooth, and Lipschitz loss functions.
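          As a point of reference, the following is a minimal sketch (not the authors' implementation) of the noisy gradient descent update the abstract refers to: theta_{t+1} = theta_t - eta * grad L(theta_t) + N(0, sigma^2 I), the discrete-time counterpart of Langevin diffusion. The function names and parameters (grad_fn, theta0, eta, sigma, n_steps) are illustrative assumptions.

              import numpy as np

              def noisy_gradient_descent(grad_fn, theta0, eta, sigma, n_steps, rng=None):
                  """Noisy (Langevin-style) gradient descent.

                  Each step applies theta <- theta - eta * grad_fn(theta) + Gaussian
                  noise. Only the final iterate is returned/released; the intermediate
                  iterates are assumed hidden from the adversary, which is the setting
                  in which the paper's Renyi privacy bound converges exponentially
                  instead of growing with n_steps.
                  """
                  rng = np.random.default_rng() if rng is None else rng
                  theta = np.asarray(theta0, dtype=float)
                  for _ in range(n_steps):
                      noise = rng.normal(0.0, sigma, size=theta.shape)  # isotropic Gaussian noise
                      theta = theta - eta * grad_fn(theta) + noise      # noisy gradient step
                  return theta

          By contrast, composition-based accountants (as used for standard DP-SGD) charge a privacy cost for every one of the n_steps iterations, so their bound grows with the training length even when only the final parameters are published.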


          Author and article information

          Published: 11 February 2021
          Article type: Preprint
          arXiv: 2102.05855
          Record ID: 654f74dd-b08b-4fa7-898a-091c4a0f876f
          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          History
          arXiv categories: stat.ML, cs.CR, cs.LG
          Subject areas: Security & Cryptology, Machine learning, Artificial intelligence
