Markov process
or Markoff process
noun
, Statistics.
- a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding.
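The defining property above — the next value depends only on the immediately preceding one, not on the earlier history — can be sketched as a minimal simulation. The states and transition probabilities below are illustrative assumptions, not part of the definition:

```python
import random

# Illustrative two-state Markov chain: each next state is sampled using
# only the current state, never the earlier history (the Markov property).
# State names and probabilities are made-up example values.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state from the current state's transition row."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Return a path of n transitions starting from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(simulate("sunny", 5))
```

Note that `step` receives only the current state as input, which is exactly the "dependent only on the event immediately preceding" clause of the definition.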
Word History and Origins
Origin of Markov process
1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it