What is a MARKOV CHAIN?

A sequence of stochastic events governed by probabilities rather than certainties. The defining property is that the next state of a variable or system depends only on its present state, independent of all past states. Stock/share price movements, and a firm's market share increasing or decreasing, are examples of Markov chains. Named after the Russian mathematician Andrei Andreevich Markov (1856-1922), the inventor of Markov analysis. Also known as a Markov model. See also Markov process.
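The "memoryless" property described above can be illustrated with a minimal sketch in Python. The two states ("up", "down") and their transition probabilities are hypothetical, chosen only to mimic a share price that moves up or down; note that the next state is drawn using the current state alone, never the earlier history.

```python
import random

# Hypothetical transition probabilities for a two-state chain:
# from each current state, the probabilities of the next state.
TRANSITIONS = {
    "up":   {"up": 0.6, "down": 0.4},
    "down": {"up": 0.5, "down": 0.5},
}

def next_state(current, rng):
    """Draw the next state using only the current state's probabilities."""
    r = rng.random()
    cumulative = 0.0
    for state, p in TRANSITIONS[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

def simulate(start, steps, seed=0):
    """Simulate the chain for a number of steps from a starting state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("up", 5))
```

Because each row of the transition table sums to 1, every step is a complete probability distribution over the next state, which is exactly the structure a Markov chain requires.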

Written and fact checked by The Law Dictionary