Markov chain
or Mar·koff chain
[ mahr-kawf ]
noun, Statistics.
- a Markov process restricted to discrete random events or to discontinuous time sequences.
Markov chain
/ ˈmɑːkɒf /
noun
- statistics a sequence of events the probability for each of which is dependent only on the event immediately preceding it
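The definitions above can be made concrete with a small sketch. The following Python snippet (not part of either dictionary entry) simulates a hypothetical two-state chain; the "weather" states and transition probabilities are invented for illustration.

```python
import random

# Transition probabilities: chance of each next state given only the current state.
# These values are made-up illustrative numbers, not data from the entry.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next event using only the immediately preceding one (the Markov property)."""
    options = transitions[current]
    return random.choices(list(options), weights=list(options.values()))[0]

# Simulate a short chain of events starting from "sunny".
state = "sunny"
chain = [state]
for _ in range(10):
    state = next_state(state)
    chain.append(state)
print(chain)
```

Each call to next_state consults only the current state, never the earlier history, which is the restriction the definitions describe.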
Word History and Origins
Origin of Markov chain
First recorded in 1940–45; Markov process
C20: named after Andrei Markov (1856–1922), Russian mathematician