Markov chain
or Mar·koff chain
[ mahr-kawf ]
noun, Statistics.
- a Markov process restricted to discrete random events or to discontinuous time sequences.
Markov chain
/ ˈmɑːkɒf /
noun
- statistics: a sequence of events, the probability of each of which depends only on the event immediately preceding it
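
The sense above can be made concrete with a short simulation. The sketch below is illustrative only: the two weather states and their transition probabilities are invented for this example, and the code simply draws each next event from the current event alone, which is the defining property of a Markov chain.

```python
import random

# Hypothetical two-state chain: the states and probabilities below are
# invented for illustration; they are not part of the dictionary entry.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps):
    """Generate a sequence in which each event depends only on the one before it."""
    state = start
    history = [state]
    for _ in range(steps):
        nxt = TRANSITIONS[state]
        # The draw uses only the current state: the Markov property.
        state = random.choices(list(nxt), weights=list(nxt.values()))[0]
        history.append(state)
    return history

print(simulate("sunny", 10))  # e.g. ['sunny', 'sunny', 'rainy', ...]
```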
Word History and Origins
Origin of Markov chain
First recorded in 1940–45; Markov process
C20: named after Andrei Markov (1856–1922), Russian mathematician
Example Sentences
These rules could be decomposed into two sets that dominate at distinct length scales: Markov chain and random nuclei.
From Science Daily
The Markov chain, simple as it is, somehow captures something of the style of naming practices of different eras.
From Scientific American
My ego would like me to believe that my writing process is a little more complicated than a Markov chain.
From The Verge
The full Markov chain Monte Carlo analysis and uncertainties are discussed in Methods.
From Nature
The main tool in the Duke paper is a method called the “Markov chain Monte Carlo” algorithm.
From New York Times