Krichevsky–Trofimov estimator


In information theory, given an unknown stationary source π with alphabet A and a sample w from π, the Krichevsky–Trofimov (KT) estimator produces an estimate p_i(w) of the probability of each symbol i ∈ A. This estimator is optimal in the sense that it minimizes the worst-case regret asymptotically.

For a binary alphabet and a string w with m zeroes and n ones, the KT estimator p_i(w) is defined as:[1]

p_0(w) = \frac{m + 1/2}{m + n + 1}, \qquad p_1(w) = \frac{n + 1/2}{m + n + 1}.
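A minimal Python sketch of this computation (not part of the cited source; the function name kt_binary_estimate is chosen here purely for illustration):

from fractions import Fraction

def kt_binary_estimate(w: str):
    """Binary KT estimate: add 1/2 to each symbol count and normalize."""
    m = w.count("0")  # number of zeroes in w
    n = w.count("1")  # number of ones in w
    denom = Fraction(m + n + 1)
    p0 = (m + Fraction(1, 2)) / denom
    p1 = (n + Fraction(1, 2)) / denom
    return p0, p1

# Example: w = "0001" has m = 3 zeroes and n = 1 one, so
# p0(w) = (3 + 1/2) / 5 = 7/10 and p1(w) = (1 + 1/2) / 5 = 3/10.
print(kt_binary_estimate("0001"))  # (Fraction(7, 10), Fraction(3, 10))

Exact rational arithmetic via fractions keeps the add-one-half pseudocounts explicit and avoids floating-point rounding.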

This is the posterior mean of a Bernoulli parameter under a Beta(1/2, 1/2) (Jeffreys) prior. For a general alphabet, the estimate is obtained in the same way from a Dirichlet–categorical model with a symmetric Dirichlet(1/2, …, 1/2) prior.
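As a short worked check (a sketch, not taken from the cited reference), the posterior-mean calculation under a Beta(1/2, 1/2) prior reproduces the binary formula above, and a symmetric Dirichlet(1/2, …, 1/2) prior with symbol counts n_i gives the general-alphabet analogue:

% Binary case: prior \theta ~ Beta(1/2, 1/2) on the probability of a one;
% after observing n ones and m zeroes the posterior is Beta(n + 1/2, m + 1/2),
% and its mean is the KT estimate:
\[
  p_1(w) = \mathbb{E}[\theta \mid w]
         = \frac{n + \tfrac{1}{2}}{\bigl(n + \tfrac{1}{2}\bigr) + \bigl(m + \tfrac{1}{2}\bigr)}
         = \frac{n + \tfrac{1}{2}}{m + n + 1}.
\]
% General alphabet A with symbol counts n_i and a symmetric Dirichlet(1/2, ..., 1/2) prior:
\[
  p_i(w) = \frac{n_i + \tfrac{1}{2}}{\sum_{j \in A} n_j + \tfrac{|A|}{2}}.
\]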


References


