In information theory, the entropy power inequality (EPI) is a result concerning the so-called "entropy power" of random variables. It shows that the entropy power of suitably well-behaved random variables is a superadditive function. The entropy power inequality was proved in 1948 by Claude Shannon in his seminal paper "A Mathematical Theory of Communication". Shannon also provided a sufficient condition for equality to hold; Stam (1959) showed that the condition is in fact necessary.
For a random vector X : Ω → R^n with probability density function f : R^n → R, the differential entropy of X, denoted h(X), is defined to be

    h(X) = -\int_{\mathbb{R}^n} f(x) \log f(x) \, dx,
and the entropy power of X, denoted N(X), is defined to be

    N(X) = \frac{1}{2\pi e} \, e^{\frac{2}{n} h(X)}.
In particular, N(X) = |K|^{1/n} when X is normally distributed with covariance matrix K.
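This special case can be checked directly (a short derivation sketch, using natural logarithms and the standard formula for the differential entropy of a multivariate normal with covariance matrix K):

    h(X) = \tfrac{1}{2} \log\!\left( (2\pi e)^n |K| \right),
    N(X) = \frac{1}{2\pi e} \, e^{\frac{2}{n} h(X)} = \frac{1}{2\pi e} \left( (2\pi e)^n |K| \right)^{1/n} = |K|^{1/n}.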
Let X and Y be independent random variables with probability density functions in the L^p space L^p(R^n) for some p > 1. Then

    N(X + Y) \geq N(X) + N(Y).
Moreover, equality holds if and only if X and Y are multivariate normal random variables with proportional covariance matrices.
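As an illustration of the equality case (a worked example, not part of the original statement), let X ~ N(0, K) and Y ~ N(0, cK) be independent for some scalar c > 0, so that their covariance matrices are proportional. Then X + Y ~ N(0, (1 + c)K), and

    N(X + Y) = |(1 + c)K|^{1/n} = (1 + c)\,|K|^{1/n} = |K|^{1/n} + |cK|^{1/n} = N(X) + N(Y),

so the inequality indeed holds with equality.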
The entropy power inequality can be rewritten in an equivalent form that does not explicitly depend on the definition of entropy power (see Costa and Cover reference below).
Let X and Y be independent random variables, as above. Then let X' and Y' be independent random variables with white Gaussian distributions (covariance matrices proportional to the identity), such that

    h(X') = h(X) \quad \text{and} \quad h(Y') = h(Y).
Then

    h(X + Y) \geq h(X' + Y').
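The equivalence of the two forms can be seen as follows (a brief sketch, assuming X' and Y' are chosen as white Gaussians as above). Since equality holds in the entropy power inequality for Gaussians with proportional covariance matrices,

    N(X' + Y') = N(X') + N(Y') = N(X) + N(Y),

where the last step uses h(X') = h(X) and h(Y') = h(Y). Because N(Z) = \frac{1}{2\pi e} e^{\frac{2}{n} h(Z)} is strictly increasing in h(Z), the inequality h(X + Y) \geq h(X' + Y') is therefore equivalent to N(X + Y) \geq N(X) + N(Y).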