Markov process
or Markoff process
noun, Statistics.
- a process in which future values of a random variable depend statistically only on its present state, not on the sequence of states that preceded it.
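Stated more formally (a standard textbook formulation, not part of the entry above), the Markov property for a discrete-time process X_0, X_1, X_2, … says the next state depends only on the current one:

P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, …, X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)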
Word History and Origins
Origin of Markov process
1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it
Example Sentences
“A Markov process is where you have a sequence of numbers or letters or notes, and the probability of any particular note depends only on the few notes that have come before,” said Kershenbaum.
From Washington Post
A “Markov process” or “Markov chain” is a sequence of random states in which the probability of what comes at the next time step depends only on the current state and not on anything earlier.
From Scientific American
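As a rough illustration of the idea in the quotations above (a minimal sketch, not taken from either source; the two weather states and their probabilities are invented for the example):

```python
import random

# Hypothetical two-state chain: each row gives P(next state | current state).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state using only the current state (the Markov property)."""
    states = list(transitions[current])
    weights = [transitions[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

state = "sunny"
path = [state]
for _ in range(10):
    state = next_state(state)  # depends only on the previous state, nothing earlier
    path.append(state)
print(path)
```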