Lumpability
In probability theory, lumpability is a method for reducing the size of the state space of some continuous-time Markov chains, first published by Kemeny and Snell.[1]
Definition
Suppose that the complete state space of a Markov chain is divided into disjoint subsets of states, where these subsets are denoted by t_i. This forms a partition of the states. Both the state space and the collection of subsets may be either finite or countably infinite. A continuous-time Markov chain is lumpable with respect to the partition T if and only if, for any subsets t_i and t_j in the partition, and for any states n, n′ in subset t_i,

\sum_{m \in t_j} q(n,m) = \sum_{m \in t_j} q(n',m),

where q(i,j) is the transition rate from state i to state j.[2]
Similarly, for a stochastic matrix P, P is a lumpable matrix on a partition T if and only if, for any subsets t_i and t_j in the partition, and for any states n, n′ in subset t_i,

\sum_{m \in t_j} p(n,m) = \sum_{m \in t_j} p(n',m),

where p(i,j) is the probability of moving from state i to state j.[3]
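For illustration, the condition for a stochastic matrix can be checked numerically as follows. This Python sketch is not from the cited sources; the function name is_lumpable, the tolerance parameter, and the representation of the partition as a list of lists of state indices are assumptions made here.

import numpy as np

def is_lumpable(P, partition, tol=1e-12):
    """Check whether the stochastic matrix P is lumpable on `partition`,
    given as a list of lists of (0-indexed) state indices."""
    for source in partition:
        for target in partition:
            # Aggregate the probability of moving from each state in `source`
            # into the block `target` as a whole.
            block_sums = P[np.ix_(source, target)].sum(axis=1)
            # Lumpability requires these aggregated probabilities to agree
            # for every state in the source block.
            if not np.allclose(block_sums, block_sums[0], atol=tol):
                return False
    return True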
Example
Consider the matrix

P = \begin{pmatrix} \frac{1}{2} & \frac{3}{8} & \frac{1}{16} & \frac{1}{16} \\ \frac{7}{16} & \frac{7}{16} & 0 & \frac{1}{8} \\ \frac{1}{16} & 0 & \frac{1}{2} & \frac{7}{16} \\ 0 & \frac{1}{16} & \frac{3}{8} & \frac{9}{16} \end{pmatrix}

and notice it is lumpable on the partition t = {(1,2),(3,4)}, so we write

P_t = \begin{pmatrix} \frac{7}{8} & \frac{1}{8} \\ \frac{1}{16} & \frac{15}{16} \end{pmatrix}

and call P_t the lumped matrix of P on t.
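This example can be reproduced numerically with the short sketch below; the helper lump is illustrative and not from the article, and it assumes P is already known to be lumpable on the given partition. States are 0-indexed in the code, so the partition {(1,2),(3,4)} becomes [[0, 1], [2, 3]].

import numpy as np

def lump(P, partition):
    """Build the lumped matrix by aggregating the row of one representative
    state per block; by lumpability any representative gives the same result."""
    k = len(partition)
    Pt = np.zeros((k, k))
    for i, source in enumerate(partition):
        rep = source[0]  # representative state of the source block
        for j, target in enumerate(partition):
            Pt[i, j] = P[rep, target].sum()
    return Pt

P = np.array([[1/2,  3/8,  1/16, 1/16],
              [7/16, 7/16, 0,    1/8 ],
              [1/16, 0,    1/2,  7/16],
              [0,    1/16, 3/8,  9/16]])

print(lump(P, [[0, 1], [2, 3]]))  # [[0.875, 0.125], [0.0625, 0.9375]], i.e. P_t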
Successively lumpable processes
In 2012, Katehakis and Smit identified the class of successively lumpable processes, for which the stationary probabilities can be obtained by successively computing the stationary probabilities of a propitiously constructed sequence of Markov chains. Each of these chains has a (typically much) smaller state space, which yields significant computational improvements. These results have many applications in reliability and queueing models and problems.[4]
Quasi-lumpability
Franceschinis and Muntz introduced quasi-lumpability, a property whereby a small change in the rate matrix makes the chain lumpable.[5]
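As a rough illustration of the idea (this measure is an assumption made here, not the precise construction of Franceschinis and Muntz), one can quantify how far a rate matrix is from being lumpable by the largest disagreement in aggregated rates within a block; a chain is quasi-lumpable when this gap is small, so that a small perturbation of the rate matrix restores exact lumpability.

import numpy as np

def lumpability_gap(Q, partition):
    """Largest spread of aggregated off-diagonal rates within any block;
    a value of 0 means the chain is exactly lumpable on `partition`."""
    R = Q.copy().astype(float)
    np.fill_diagonal(R, 0.0)  # compare only rates between distinct states
    gap = 0.0
    for source in partition:
        for target in partition:
            sums = R[np.ix_(source, target)].sum(axis=1)
            gap = max(gap, float(sums.max() - sums.min()))
    return gap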
References
- ↑ Kemeny, John G.; Snell, J. Laurie (1976). Finite Markov Chains. Springer.
- ↑ Hillston, Jane (1995). "Compositional Markovian Modelling Using a Process Algebra". Proceedings of the Second International Workshop on the Numerical Solution of Markov Chains: Computations with Markov Chains, Raleigh, North Carolina, January 1995. Kluwer Academic Press.
- ↑ Harrison, Peter G.; Patel, Naresh M. (1992). Performance Modelling of Communication Networks and Computer Architectures. Addison-Wesley.
- ↑ Katehakis, Michael N.; Smit, Laurens C. (2012). "A Successive Lumping Procedure for a Class of Markov Chains". Probability in the Engineering and Informational Sciences.
- ↑ Franceschinis, Giuliana; Muntz, Richard R. (1994). "Bounds for Quasi-Lumpable Markov Chains". Performance Evaluation.