
Markov process

or Markoff process

noun, Statistics.
  1. a process in which future values of a random variable are statistically determined by present events and depend only on the event immediately preceding.
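The defining feature above, that the next value depends only on the current state and not on the earlier history, can be illustrated with a small simulation. The two-state "weather" chain and its transition probabilities below are invented for this sketch; they are not part of the dictionary entry.

```python
import random

# Illustrative two-state chain; these transition probabilities are
# made up for the example, not taken from the entry above.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps, seed=0):
    """Walk the chain: each step is drawn using only the current state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        probs = TRANSITIONS[path[-1]]  # depends only on the latest state
        path.append(rng.choices(list(probs), weights=list(probs.values()))[0])
    return path

print(simulate("sunny", 5))
```

Note that `simulate` never consults anything but `path[-1]` when choosing the next state; that restriction is exactly the Markov property.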



Word History and Origins

Origin of Markov process

1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it
