
Introduction to Markov Chains

Chapter in: Queues

Part of the book series: International Series in Operations Research & Management Science (ISOR, volume 191)

Abstract

The topic of Markov processes is huge: entire volumes can be, and indeed have been, written on it, and we make no attempt at completeness here. This chapter presents the minimum required to follow what comes afterwards; in particular, we will refer back to it when results stated here are needed. For more comprehensive coverage of Markov chains and stochastic matrices, see [9, 19, 41] or [42].


Notes

  1. A square matrix is called stochastic if all its entries are nonnegative and all its row sums equal one. It is substochastic if all its entries are nonnegative and all its row sums are less than or equal to one.

  2. This does not rule out the possibility that a state j in one class is reachable from a state i in some other class (but then, of course, i is not reachable from j).

  3. The rationale behind this terminology is that for n large enough, \(P_{ii}^{n} > 0\) if and only if \(n \equiv 0 \pmod{d(i)}\).

  4. The period is a function only of the graph associated with the Markov chain. In particular, once \(P_{ij}\) is positive, its actual value is immaterial for determining the period.

  5. This proof appears in [16], p. 165.

  6. The proof is as follows. Of course, \(P_{ij}\) is the probability of moving straight to state j. In the new process there is, however, another way to visit state j immediately after state i: go first to state n (with probability \(P_{in}\)) and then, conditioning on leaving state n, move directly to state j (with probability \(P_{nj}/(1 - P_{nn})\)).

  7. The Perron-Frobenius theorem guarantees that this eigenvalue is real and unique in the case where P JJ is aperiodic and irreducible. See, e.g., [42], p. 9.
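The definitions in note 1 translate directly into code. A minimal sketch, assuming NumPy; the function names `is_stochastic` and `is_substochastic` are illustrative, not from the text:

```python
import numpy as np

def is_stochastic(P, tol=1e-12):
    """True if P is square, entrywise nonnegative, and every row sums to one."""
    P = np.asarray(P, dtype=float)
    if P.ndim != 2 or P.shape[0] != P.shape[1]:
        return False
    return bool((P >= 0).all() and np.allclose(P.sum(axis=1), 1.0, atol=tol))

def is_substochastic(P, tol=1e-12):
    """True if P is square, entrywise nonnegative, and every row sums to at most one."""
    P = np.asarray(P, dtype=float)
    if P.ndim != 2 or P.shape[0] != P.shape[1]:
        return False
    return bool((P >= 0).all() and (P.sum(axis=1) <= 1.0 + tol).all())

P = [[0.5, 0.5], [0.2, 0.8]]   # stochastic: rows sum to 1
Q = [[0.5, 0.3], [0.2, 0.8]]   # substochastic: first row sums to 0.8
```

Note that every stochastic matrix is, in particular, substochastic under this definition.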
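As notes 3 and 4 observe, the period d(i) depends only on the zero/nonzero pattern of P, i.e., on the chain's graph. A sketch of computing it as the gcd of return times, assuming NumPy; `period` is an illustrative name, and the bound on powers is a simple safe choice for finite chains:

```python
from math import gcd
import numpy as np

def period(P, i, max_power=None):
    """Period d(i): the gcd of all n with (P^n)[i, i] > 0.
    Only the sparsity pattern of P is used. Returns 0 if state i
    is never revisited within max_power steps."""
    P = np.asarray(P, dtype=float)
    n_states = P.shape[0]
    if max_power is None:
        max_power = 2 * n_states * n_states  # ample for a finite chain
    A = (P > 0).astype(int)        # adjacency matrix of the chain's graph
    M = np.eye(n_states, dtype=int)
    d = 0
    for n in range(1, max_power + 1):
        M = (M @ A > 0).astype(int)  # which pairs are linked in exactly n steps
        if M[i, i]:
            d = gcd(d, n)
            if d == 1:               # gcd can only shrink, so stop at 1
                break
    return d

# Deterministic 3-cycle: state 0 is revisited only at multiples of 3.
C = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]
```

Replacing the entries of C by any positive probabilities with the same zero pattern leaves the result unchanged, which is exactly the point of note 4.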
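The formula in note 6 can be checked numerically: folding visits to a removed state n into direct transitions among the remaining states yields a matrix that is again stochastic. The 3-state matrix below is an arbitrary illustration, not taken from the text:

```python
import numpy as np

# A 3-state chain; we remove the last state (index n = 2) and fold its
# visits into direct transitions among the remaining two states.
P = np.array([[0.2, 0.3, 0.5],
              [0.4, 0.1, 0.5],
              [0.3, 0.3, 0.4]])
n = 2

# New transition probabilities from the note:
# P~_ij = P_ij + P_in * P_nj / (1 - P_nn)
P_new = P[:n, :n] + np.outer(P[:n, n], P[n, :n]) / (1.0 - P[n, n])

print(P_new.sum(axis=1))   # each row sums to 1: P_new is stochastic
```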

References

  1. Billingsley, P. (1995). Probability and measure (3rd ed.). New York: Wiley.

  2. Denardo, E. V. (1982). Dynamic programming: Models and applications. Englewood Cliffs: Prentice-Hall.

  3. Feller, W. (1968). An introduction to probability theory and its applications (3rd ed.). New York: Wiley.

  4. Kemeny, J. G., & Snell, J. L. (1961). Finite Markov chains. New York: D. Van Nostrand.

  5. Ross, S. M. (1996). Stochastic processes (2nd ed.). New York: Wiley.

  6. Seneta, E. (2006). Non-negative matrices and Markov chains (revised printing). New York: Springer.


Copyright information

© 2013 Springer Science+Business Media New York

Cite this chapter

Haviv, M. (2013). Introduction to Markov Chains. In: Queues. International Series in Operations Research & Management Science, vol 191. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-6765-6_3
