Markov process
or Markoff process
noun, Statistics.
- a process in which the future values of a random variable depend statistically only on the present state, that is, on the event immediately preceding, and not on the earlier history of the process.
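
In symbols, the defining (Markov) property can be sketched as follows, assuming for illustration a discrete-time process with random variables X_1, X_2, …:

\[
P(X_{n+1} = x \mid X_n = x_n, \ldots, X_1 = x_1) = P(X_{n+1} = x \mid X_n = x_n)
\]

That is, once the present state X_n is known, the earlier values X_1, …, X_{n-1} give no additional information about the next value.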
Word History and Origins
Origin of Markov process
1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it