Markov Chains and Applications

Fundamentals of Probability: A First Course

Part of the book series: Springer Texts in Statistics ((STS))

Abstract

In many applications, successive observations of a process, say X_1, X_2, …, have an inherent time component associated with them. For example, X_i could be the state of the weather at a particular location on the i-th day, counting from some fixed day. In a simplistic model, the state of the weather could be "dry" or "wet," quantified as, say, 0 and 1. It is hard to believe that in such an example the sequence X_1, X_2, … could be mutually independent. The question then arises how to model the dependence among the X_i. Probabilists have numerous dependency models. A particular model that has earned a very special status is called the Markov chain. In a Markov chain model, we assume that the future, given the entire past and the present state of a process, depends only on the present state. In the weather example, suppose we want to assign a probability that tomorrow, say March 10, will be dry, and suppose that we have available to us the precipitation history for each day from January 1 to March 9. In a Markov chain model, our probability that March 10 will be dry will depend only on the state of the weather on March 9, even though the entire past precipitation history was available to us. As simple as it sounds, Markov chains are enormously useful in applications, perhaps more than any other specific dependency model. They are also independently relevant to statistical computing in very important ways. The topic has an incredibly rich and well-developed theory, with links to many other topics in probability theory. Familiarity with basic Markov chain terminology and theory is often considered essential for anyone interested in studying statistics and probability. We present an introduction to basic Markov chain theory in this chapter.
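The two-state weather chain described above can be sketched in a few lines of code. The transition probabilities used here (e.g., an 0.8 chance that a dry day is followed by a dry day) are illustrative assumptions, not values from the chapter; the point is only that each day's state is drawn using the current state alone, which is the Markov property.

```python
import random

# Hypothetical two-state weather chain: 0 = "dry", 1 = "wet".
# The transition probabilities below are made up for illustration.
P = {
    0: {0: 0.8, 1: 0.2},  # P(tomorrow = dry/wet | today = dry)
    1: {0: 0.4, 1: 0.6},  # P(tomorrow = dry/wet | today = wet)
}

def simulate(start, n_days):
    """Simulate n_days of weather, one step at a time.

    Each new state is drawn using only the current state -- the
    entire earlier history is never consulted (Markov property).
    """
    state, path = start, [start]
    for _ in range(n_days):
        state = 0 if random.random() < P[state][0] else 1
        path.append(state)
    return path

# Starting from a dry day, simulate the next nine days
# (January 1 through March 9 would work the same way, just longer).
path = simulate(start=0, n_days=9)
```

With this setup, the probability that "tomorrow is dry" is read off a single row of `P`, indexed by today's state, exactly as the abstract describes for March 9 and March 10.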



Author information

Corresponding author

Correspondence to Anirban DasGupta.


Copyright information

© 2010 Springer-Verlag New York

About this chapter

Cite this chapter

DasGupta, A. (2010). Markov Chains and Applications. In: Fundamentals of Probability: A First Course. Springer Texts in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-5780-1_14
