Nat (unit)

From testwiki
Revision as of 17:05, 11 January 2025 by imported>The Wonk (Added image and caption)


[Image caption: Units of information measurement.]

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base-2 logarithms that define the shannon. One nat is the information content of an event when the probability of that event occurring is 1/e.

One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, 1/ln 10 hartleys ≈ 0.434 Hart.
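These conversion factors follow directly from the change-of-base rule for logarithms. A minimal sketch (the function names are illustrative, not from any standard library):

```python
import math

def nats_to_shannons(x_nats):
    """Convert an information quantity from nats to shannons (bits): divide by ln 2."""
    return x_nats / math.log(2)

def nats_to_hartleys(x_nats):
    """Convert an information quantity from nats to hartleys: divide by ln 10."""
    return x_nats / math.log(10)

print(round(nats_to_shannons(1.0), 2))  # 1.44 (Sh)
print(round(nats_to_hartleys(1.0), 3))  # 0.434 (Hart)
```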

History

Boulton and Wallace used the term nit in conjunction with minimum message length, which was subsequently changed by the minimum description length community to nat to avoid confusion with the nit used as a unit of luminance.

Alan Turing used the natural ban.

Entropy

Shannon entropy (information entropy), being the expected value of the information of an event, is inherently a quantity of the same type and with a unit of information. The International System of Units, by assigning the same unit (joule per kelvin) both to heat capacity and to thermodynamic entropy, implicitly treats information entropy as a quantity of dimension one, with 1 nat = kB. Systems of natural units that normalize the Boltzmann constant to 1 effectively measure thermodynamic entropy with the nat as unit.

When the Shannon entropy is written using a natural logarithm, H = −Σi pi ln pi, it implicitly gives a number measured in nats.
