SAGE Social Science Thesaurus

Concept information

Preferred term

Markov chains

Definition

  • Markov chains are a well-developed topic in probability, and there are many fine expositions of them (e.g., Bremaud, 2008; Feller, 1968; Hoel, Port, & Stone, 1972; Kemeny & Snell, 1960). [Source: Encyclopedia of Research Design; Markov Chains]
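The definition above points to expositions rather than the mechanism itself; as a concrete illustration, here is a minimal sketch of the concept, assuming a hypothetical two-state chain with an illustrative transition matrix (not drawn from the cited sources). The key property is that the next state depends only on the current state.

```python
import random

# Hypothetical two-state Markov chain for illustration only.
# P[i][j] is the probability of moving from state i to state j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate(n_steps, start=0, seed=42):
    """Simulate the chain and return the empirical frequency of each state.

    Each transition uses only the current state (the Markov property).
    """
    rng = random.Random(seed)
    state = start
    counts = [0, 0]
    for _ in range(n_steps):
        # Draw the next state from the current state's transition row.
        state = 0 if rng.random() < P[state][0] else 1
        counts[state] += 1
    return [c / n_steps for c in counts]
```

For this particular matrix, solving the balance equations gives a stationary distribution of (5/6, 1/6), so the empirical frequencies from a long simulation should approach roughly (0.833, 0.167).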

URI

http://data.loterre.fr/ark:/67375/N9J-TC78PZ9T-D
