Testwiki:Reference desk/Archives/Mathematics/2013 November 3
{| width="100%"
|-
! colspan="3" align="center" | Mathematics desk
|-
! width="20%" align="left" | < November 2
! width="25%" align="center" | << Oct | November | Dec >>
! width="20%" align="right" | Current desk >
|}
{|
|-
| Welcome to the Wikipedia Mathematics Reference Desk Archives
|-
| The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
|}
November 3
Entropy of uniform distribution
Melbourne, Australia
Looking at the entries for the uniform distribution, the entropy is given as ln(n) for the discrete case and ln(b − a) for the continuous case. For a normalised distribution on the interval [0,1], the discrete case agrees with Shannon's definition of entropy, but the continuous case gives an entropy of 0. Is this correct? Rjeges (talk) 10:06, 3 November 2013 (UTC)
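- The two formulas mentioned in the question can be checked numerically. A minimal sketch in Python (the variable names below are illustrative, not from the articles cited):

```python
import math

# Discrete uniform on n outcomes: Shannon entropy H = -sum p*log(p) = ln(n)
n = 8
p = 1.0 / n
H_discrete = -sum(p * math.log(p) for _ in range(n))
print(H_discrete, math.log(n))  # both equal ln(8)

# Continuous uniform on [a, b]: differential entropy h = ln(b - a)
a, b = 0.0, 1.0
h_continuous = math.log(b - a)
print(h_continuous)  # ln(1) = 0 for the normalised interval [0, 1]
```

- So for [0,1] the continuous formula does give exactly 0, which is the behaviour the question describes.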
- I think it would be more accurate to say that the entropy in the continuous case is undefined. Looie496 (talk) 14:30, 3 November 2013 (UTC)
- Let X be a random variable with a probability density function f whose support is a set <math>\mathcal{X}</math>. The differential entropy h(X) or h(f) is defined as
- <math>h(X) = -\int_{\mathcal{X}} f(x)\log f(x)\,dx</math>.
- One must take care in trying to apply properties of discrete entropy to differential entropy, since probability density functions can be greater than 1. For example, Uniform(0, 1/2) has negative differential entropy
- <math>h(X) = -\int_0^{1/2} 2\log 2\,dx = -\log 2 < 0</math>.
- Thus, differential entropy does not share all properties of discrete entropy.
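- The negative value for Uniform(0, 1/2) can be verified with a simple numerical integration of −∫ f ln f (a sketch, using a plain Riemann sum rather than any particular library routine):

```python
import math

# Differential entropy of Uniform(0, 1/2): density f(x) = 2 on [0, 1/2].
# Approximate h = -∫ f(x) ln f(x) dx with a Riemann sum; since the
# integrand is constant here, the sum is exact up to float rounding.
a, b = 0.0, 0.5
f = 1.0 / (b - a)            # density = 2, which is greater than 1
N = 100_000
dx = (b - a) / N
h = -sum(f * math.log(f) * dx for _ in range(N))
print(h, -math.log(2))       # both ≈ -0.693: the entropy is negative
```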