Log sum inequality


The log sum inequality is used for proving theorems in information theory.

Statement

Let $a_1, \ldots, a_n$ and $b_1, \ldots, b_n$ be nonnegative numbers. Denote the sum of all $a_i$ by $a$ and the sum of all $b_i$ by $b$. The log sum inequality states that

$$\sum_{i=1}^{n} a_i \log\frac{a_i}{b_i} \ge a \log\frac{a}{b},$$

with equality if and only if the ratios $\frac{a_i}{b_i}$ are equal for all $i$, in other words $a_i = c b_i$ for all $i$.

(Take $a_i \log\frac{a_i}{b_i}$ to be $0$ if $a_i = 0$, and to be $\infty$ if $a_i > 0$ and $b_i = 0$. These are the limiting values obtained as the relevant number tends to $0$.)
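As a quick numerical illustration (not part of the statement itself), the following Python sketch checks the inequality and the equality condition on small example vectors; the helper names and the data are assumptions made for the example.

    import math

    def log_sum_lhs(a, b):
        """Left-hand side: sum of a_i * log(a_i / b_i), using the conventions above."""
        total = 0.0
        for ai, bi in zip(a, b):
            if ai == 0:
                continue             # a_i log(a_i / b_i) is taken as 0 when a_i = 0
            if bi == 0:
                return math.inf      # and as +infinity when a_i > 0 and b_i = 0
            total += ai * math.log(ai / bi)
        return total

    def log_sum_rhs(a, b):
        """Right-hand side: a * log(a / b) with a = sum(a_i) and b = sum(b_i)."""
        A, B = sum(a), sum(b)
        if A == 0:
            return 0.0
        if B == 0:
            return math.inf
        return A * math.log(A / B)

    a = [0.2, 0.5, 0.3]
    b = [0.4, 0.4, 0.2]
    print(log_sum_lhs(a, b) >= log_sum_rhs(a, b))                    # True

    # Equality case: a_i = c * b_i for every i
    c = 2.5
    a_eq = [c * bi for bi in b]
    print(math.isclose(log_sum_lhs(a_eq, b), log_sum_rhs(a_eq, b)))  # True

The choice of natural logarithm is immaterial; the inequality holds for any fixed base.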

Proof

Notice that after setting $f(x) = x \log x$ we have

$$\sum_{i=1}^{n} a_i \log\frac{a_i}{b_i} = \sum_{i=1}^{n} b_i f\!\left(\frac{a_i}{b_i}\right) = b \sum_{i=1}^{n} \frac{b_i}{b} f\!\left(\frac{a_i}{b_i}\right) \ge b\, f\!\left(\sum_{i=1}^{n} \frac{b_i}{b} \cdot \frac{a_i}{b_i}\right) = b\, f\!\left(\frac{1}{b} \sum_{i=1}^{n} a_i\right) = b\, f\!\left(\frac{a}{b}\right) = a \log\frac{a}{b},$$

where the inequality follows from Jensen's inequality since $\frac{b_i}{b} \ge 0$, $\sum_{i=1}^{n} \frac{b_i}{b} = 1$, and $f$ is convex.
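The only inequality in the chain is the Jensen step. The sketch below checks that step numerically for the convex function $f(x) = x \log x$; the data are the same assumed example vectors as above.

    import math

    f = lambda x: x * math.log(x)          # convex on (0, infinity)

    a = [0.2, 0.5, 0.3]
    b = [0.4, 0.4, 0.2]
    B = sum(b)
    weights = [bi / B for bi in b]         # nonnegative weights summing to 1

    # Jensen's inequality for convex f: the weighted mean of f is at least f of the weighted mean.
    lhs = sum(w * f(ai / bi) for w, ai, bi in zip(weights, a, b))
    rhs = f(sum(w * (ai / bi) for w, ai, bi in zip(weights, a, b)))
    print(lhs >= rhs)                      # True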

Generalizations

The inequality remains valid for $n = \infty$ provided that $a < \infty$ and $b < \infty$. The proof above holds for any function $g$ such that $f(x) = x g(x)$ is convex, such as all continuous non-decreasing functions. Generalizations to non-decreasing functions other than the logarithm are given in Csiszár, 2004.
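In that setting the Jensen argument yields $\sum_{i=1}^{n} a_i\, g\!\left(\frac{a_i}{b_i}\right) \ge a\, g\!\left(\frac{a}{b}\right)$ whenever $x g(x)$ is convex. As an illustrative check, the sketch below uses the assumed choice $g(x) = \sqrt{x}$, for which $x g(x) = x^{3/2}$ is convex, together with arbitrary example data.

    import math

    g = math.sqrt                          # non-decreasing; x * g(x) = x**1.5 is convex

    a = [1.0, 2.0, 3.0]
    b = [2.0, 1.0, 1.5]
    A, B = sum(a), sum(b)

    lhs = sum(ai * g(ai / bi) for ai, bi in zip(a, b))
    rhs = A * g(A / B)
    print(lhs >= rhs)                      # True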

Another generalization is due to Dannan, Neff and Thiel, who showed that if $a_1, a_2, \ldots, a_n$ and $b_1, b_2, \ldots, b_n$ are positive real numbers with $a_1 + a_2 + \cdots + a_n = a$ and $b_1 + b_2 + \cdots + b_n = b$, and $k \ge 0$, then

$$\sum_{i=1}^{n} a_i \log\!\left(\frac{a_i}{b_i} + k\right) \ge a \log\!\left(\frac{a}{b} + k\right).$$ [1]
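The following sketch checks this $k$-shifted form for a few values of $k \ge 0$; the vectors are arbitrary example data, not taken from the reference.

    import math

    def lhs_k(a, b, k):
        return sum(ai * math.log(ai / bi + k) for ai, bi in zip(a, b))

    def rhs_k(a, b, k):
        A, B = sum(a), sum(b)
        return A * math.log(A / B + k)

    a = [1.0, 2.0, 3.0]
    b = [2.0, 1.0, 1.5]
    for k in (0.0, 0.5, 2.0):
        print(k, lhs_k(a, b, k) >= rhs_k(a, b, k))   # True for each k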

Applications

The log sum inequality can be used to prove inequalities in information theory. Gibbs' inequality states that the Kullback–Leibler divergence is non-negative, and equal to zero precisely if its arguments are equal. One proof uses the log sum inequality.
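For probability distributions $p$ and $q$, applying the log sum inequality with $a_i = p_i$ and $b_i = q_i$ gives $D_{\mathrm{KL}}(p \| q) = \sum_i p_i \log\frac{p_i}{q_i} \ge 1 \cdot \log\frac{1}{1} = 0$, since both sets of terms sum to $1$. A small numerical sketch with assumed example distributions:

    import math

    def kl_divergence(p, q):
        """D(p || q) = sum of p_i * log(p_i / q_i), with 0 log 0 taken as 0."""
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.1, 0.6, 0.3]
    q = [0.3, 0.3, 0.4]
    print(kl_divergence(p, q) >= 0)                                  # True (Gibbs' inequality)
    print(math.isclose(kl_divergence(p, p), 0.0, abs_tol=1e-12))     # equality when p = q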

The inequality can also be used to prove the convexity of the Kullback–Leibler divergence.
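The convexity in question is joint convexity in the pair of arguments. The sketch below checks one instance numerically; the distributions and the mixing weight are assumptions made for the example.

    import math

    def kl(p, q):
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    def mix(x, y, lam):
        return [lam * xi + (1 - lam) * yi for xi, yi in zip(x, y)]

    p1, q1 = [0.2, 0.8], [0.5, 0.5]
    p2, q2 = [0.7, 0.3], [0.4, 0.6]
    lam = 0.3

    # Joint convexity: D(mix of p's || mix of q's) <= lam * D(p1 || q1) + (1 - lam) * D(p2 || q2)
    lhs = kl(mix(p1, p2, lam), mix(q1, q2, lam))
    rhs = lam * kl(p1, q1) + (1 - lam) * kl(p2, q2)
    print(lhs <= rhs)                                 # True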


References