Continuity in probability

In probability theory, a stochastic process is said to be continuous in probability or stochastically continuous if its values converge in probability whenever the corresponding points of the index set converge. [1][2]

Definition

Let $X = (X_t)_{t \in T}$ be a stochastic process in $\mathbb{R}^n$. The process $X$ is continuous in probability when $X_r$ converges in probability to $X_s$ whenever $r$ converges to $s$;[2] that is, when for every $\varepsilon > 0$ and every $s \in T$, $\lim_{r \to s} P\big(|X_r - X_s| > \varepsilon\big) = 0$.
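
As an illustration (a minimal sketch, not part of the article), the defining probability $P(|X_r - X_s| > \varepsilon)$ can be estimated by Monte Carlo for standard Brownian motion, where $X_r - X_s \sim N(0, |r - s|)$; the helper name and parameter values below are hypothetical choices for the demonstration.

```python
import numpy as np

def prob_far_apart(s, r, eps, n_paths=100_000, rng=None):
    """Monte Carlo estimate of P(|X_r - X_s| > eps) for standard
    Brownian motion, using the fact that X_r - X_s ~ N(0, |r - s|)."""
    rng = rng or np.random.default_rng(0)
    increments = rng.normal(0.0, np.sqrt(abs(r - s)), size=n_paths)
    return float(np.mean(np.abs(increments) > eps))

s, eps = 1.0, 0.1
for r in [1.5, 1.1, 1.01, 1.001]:
    # The estimate shrinks toward 0 as r -> s, consistent with
    # continuity in probability at s.
    print(f"r={r}: P(|X_r - X_s| > {eps}) ~ {prob_far_apart(s, r, eps):.4f}")
```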

Examples and Applications

Feller processes are continuous in probability at $t = 0$. Continuity in probability is sometimes used as one of the defining properties of a Lévy process.[1] Any process that is continuous in probability and has independent increments has a càdlàg version.[2] As a result, some authors define a Lévy process directly as a càdlàg process with independent increments.[3]
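
To see concretely how continuity in probability differs from path continuity (an illustrative sketch, not from the article): a rate-$\lambda$ Poisson process has càdlàg sample paths that all jump, yet its increments satisfy $P(|N_r - N_s| \geq 1) = 1 - e^{-\lambda |r - s|} \to 0$ as $r \to s$, so it is continuous in probability. The rate below is an assumed value for the demonstration.

```python
import math

lam = 2.0  # assumed jump rate (illustrative choice)
s = 1.0
for r in [1.5, 1.1, 1.01, 1.001]:
    # Increments of a Poisson process are Poisson(lam * |r - s|),
    # so P(|N_r - N_s| >= 1) = 1 - exp(-lam * |r - s|), which
    # vanishes as r -> s even though every sample path jumps.
    p = 1.0 - math.exp(-lam * abs(r - s))
    print(f"r={r}: P(|N_r - N_s| >= 1) = {p:.4f}")
```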

References