
A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.[1][2][3]

In probability theory and related fields, a Markov process, named after the Russian mathematician Andrey Markov, is a stochastic process that satisfies the Markov property[1][4][5] (sometimes characterized as "memorylessness"). Roughly speaking, a process satisfies the Markov property if one can make predictions for the future of the process based solely on its present state just as well as one could knowing the process's full history, independently of that history. In other words, conditional on the present state of the system, its future and past states are independent.

A Markov chain is a type of Markov process that has either a discrete state space or a discrete index set (often representing time), but the precise definition of a Markov chain varies.[6] For example, it is common to define a Markov chain as a Markov process in either discrete or continuous time with a countable state space (thus regardless of the nature of time),[7][8][9][10] but it is also common to define a Markov chain as having discrete time in either countable or continuous state space (thus regardless of the state space).[6]
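A discrete-time, finite-state chain of this kind can be sketched in a few lines. The two-state "weather" chain and its transition probabilities below are illustrative assumptions; the point is that each step draws the next state from the current state alone, which is exactly the Markov property.

```python
import random

# Hypothetical two-state chain: P[i][j] is the probability of moving
# from state i to state j; each row of P sums to 1 (row-stochastic).
STATES = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Draw the next state using only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n, seed=0):
    """Return a sample path with n transitions, starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path
```

Because the next state depends only on the current one, the full path is generated without ever consulting earlier history.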

Markov studied Markov processes in the early 20th century, publishing his first paper on the topic in 1906.[11][12][13][14] Random walks based on integers and the gambler's ruin problem are examples of Markov processes.[15][16] Some variations of these processes were studied hundreds of years earlier in the context of independent variables.[17][18][19] Two important examples of Markov processes are the Wiener process, also known as the Brownian motion process, and the Poisson process,[20] which are considered the most important and central stochastic processes in the theory of stochastic processes,[21][22][23] and were discovered repeatedly and independently, both before and after 1906, in various settings.[24][25] These two processes are Markov processes in continuous time, while random walks on the integers and the gambler's ruin problem are examples of Markov processes in discrete time.[15][16]

Markov chains have many applications as statistical models of real-world processes,[1][26][27][28] such as studying cruise control systems in motor vehicles, queues or lines of customers arriving at an airport, exchange rates of currencies, storage systems such as dams, and population growth of certain animal species.[29] The algorithm known as PageRank, which was originally proposed for the internet search engine Google, is based on a Markov process.[1][30][31]
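The Markov process underlying PageRank can be illustrated with a toy power iteration: a "random surfer" follows links with probability given by a damping factor and otherwise teleports to a uniformly random page, and the ranks are the stationary distribution of that chain. The graph, damping value, and tolerance below are illustrative assumptions, not Google's actual implementation.

```python
def pagerank(links, damping=0.85, tol=1e-10):
    """links: dict mapping each page to the list of pages it links to.

    Iterates the random-surfer chain until the rank vector stops
    changing; the fixed point is the chain's stationary distribution.
    """
    pages = sorted(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    while True:
        new = {}
        for p in pages:
            # teleportation term plus contributions from pages linking to p
            inflow = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * inflow
        if max(abs(new[p] - rank[p]) for p in pages) < tol:
            return new
        rank = new

# Tiny illustrative link graph with no dangling pages.
ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

In this toy graph, page "a" collects the most inbound weight and ends up with the highest rank.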

Markov processes are the basis for general stochastic simulation methods known as Markov chain Monte Carlo, which are used for sampling from complex probability distributions, and have found extensive application in Bayesian statistics.[29][32][33]
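A minimal sketch of the idea, using one common MCMC method (random-walk Metropolis, a special case of Metropolis-Hastings): the algorithm constructs a Markov chain whose stationary distribution is the target density, so long runs of the chain approximate samples from it. The target, step size, and run length below are illustrative assumptions.

```python
import math
import random

def metropolis_sample(log_density, x0, n, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a one-dimensional target.

    Proposes symmetric Gaussian moves and accepts each with probability
    min(1, density ratio); the resulting Markov chain has the target
    density as its stationary distribution.
    """
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        proposal = x + rng.gauss(0.0, step)
        # accept with probability min(1, p(proposal) / p(x)),
        # compared in log space for numerical stability
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Example target: standard normal, log p(x) = -x**2 / 2 up to a constant.
draws = metropolis_sample(lambda x: -x * x / 2.0, x0=0.0, n=20000)
```

Discarding an initial burn-in segment, the empirical mean and variance of the draws approximate those of the target distribution.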