      The entropic doubling constant and robustness of Gaussian codebooks for additive-noise channels

      Preprint

          Abstract

          Entropy comparison inequalities are obtained for the differential entropy \(h(X+Y)\) of the sum of two independent random vectors \(X,Y\), when one is replaced by a Gaussian. For identically distributed random vectors \(X,Y\), these are closely related to bounds on the entropic doubling constant, which quantifies the entropy increase when adding an independent copy of a random vector to itself. Consequences of both large and small doubling are explored. For the former, lower bounds are deduced on the entropy increase when adding an independent Gaussian, while for the latter, a qualitative stability result for the entropy power inequality is obtained. In the more general case of non-identically distributed random vectors \(X,Y\), a Gaussian comparison inequality with interesting implications for channel coding is established: For additive-noise channels with a power constraint, Gaussian codebooks come within a \(\frac{{\sf snr}}{3{\sf snr}+2}\) factor of capacity. In the low-SNR regime this improves the half-a-bit additive bound of Zamir and Erez (2004). Analogous results are obtained for additive-noise multiple access channels, and for linear, additive-noise MIMO channels.
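
          To illustrate the low-SNR claim, here is a brief sketch based only on the abstract; reading the stated factor as a multiplicative gap to capacity is an assumption made for illustration, and \(R\) below is simply a placeholder for the rate guaranteed to Gaussian codebooks. With capacity \(C = \tfrac{1}{2}\log(1+{\sf snr})\), this reading gives \(C - R \le \frac{{\sf snr}}{3{\sf snr}+2}\,C\), whereas the Zamir and Erez (2004) bound gives \(C - R \le \tfrac{1}{2}\) bit. As \({\sf snr} \to 0\), the factor \(\frac{{\sf snr}}{3{\sf snr}+2} \approx \frac{{\sf snr}}{2} \to 0\), so the loss remains a vanishing fraction of capacity, while the additive half-bit bound gives no positive rate guarantee once \(C\) itself drops below half a bit.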


          Author and article information

          Posted: 11 March 2024
          arXiv: 2403.07209
          License: http://creativecommons.org/licenses/by/4.0/ (CC BY 4.0)
          Comments: 23 pages, no figures
          Subject classes: cs.IT, math.IT, math.PR
          Subject areas: Numerical methods, Information systems & theory, Probability
