Testwiki:Reference desk/Archives/Mathematics/2013 November 3



{| width="100%"
|-
! colspan="3" align="center" | Mathematics desk
|-
! width="20%" align="left" | < November 2
! width="25%" align="center" | << Oct | November | Dec >>
! width="20%" align="right" | Current desk >
|}

Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


November 3

Entropy of uniform distribution


Looking at the entries for the uniform distribution, the entropy is given as ln(n) for the discrete case and ln(b-a) for the continuous case. For a normalised distribution on the interval [0,1], the discrete case agrees with Shannon's definition of entropy, but the continuous case produces an entropy of 0. Is this correct? Rjeges (talk) 10:06, 3 November 2013 (UTC)
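For a quick check, both values do follow directly from the definitions (natural logarithm throughout):
<math>H = -\sum_{i=1}^{n} \frac{1}{n}\,\ln\frac{1}{n} = \ln n, \qquad h = -\int_0^1 1 \cdot \ln 1 \, dx = \ln(1-0) = 0.</math>
So ln(b-a) with a = 0, b = 1 does give 0, even though the distribution is perfectly normalised.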

I think it would be more accurate to say that the entropy in the continuous case is undefined. Looie496 (talk) 14:30, 3 November 2013 (UTC)
From Differential entropy#Definition:
Let X be a random variable with a probability density function f whose support is a set 𝕏. The differential entropy h(X) or h(f) is defined as
<math>h(X) = -\int_{\mathbb{X}} f(x)\,\log f(x)\,dx.</math>
and
One must take care in trying to apply properties of discrete entropy to differential entropy, since probability density functions can be greater than 1. For example, Uniform(0, 1/2) has negative differential entropy:
<math>\int_0^{1/2} -2\log(2)\,dx = -\log(2).</math>
Thus, differential entropy does not share all properties of discrete entropy.
Duoduoduo (talk) 15:58, 3 November 2013 (UTC)
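The same computations can be checked numerically. A minimal Python sketch (the function names here are illustrative, not from any library):
<syntaxhighlight lang="python">
import math

def discrete_uniform_entropy(n):
    """Shannon entropy of the uniform pmf on n outcomes: -sum p*ln(p) = ln(n)."""
    p = 1.0 / n
    return -n * p * math.log(p)

def differential_uniform_entropy(a, b):
    """Differential entropy of Uniform(a, b): -integral of f*ln(f) dx with
    f = 1/(b - a); the integrand is constant, so this reduces to ln(b - a)."""
    f = 1.0 / (b - a)
    return -(b - a) * f * math.log(f)

print(discrete_uniform_entropy(8))             # ln(8)  ~  2.0794
print(differential_uniform_entropy(0.0, 1.0))  # ln(1)  =  0.0
print(differential_uniform_entropy(0.0, 0.5))  # -ln(2) ~ -0.6931, negative
</syntaxhighlight>
The last line reproduces the Uniform(0, 1/2) example quoted above: the differential entropy comes out negative, which a discrete Shannon entropy can never be.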