Operations Research Models and Methods
 
Models Section



Discrete Time Markov Chains

We now investigate a finite-state stochastic process in which the defining random variables are observed at discrete points in time. When the future probabilistic behavior of the process depends only on the present state, regardless of when the present state is observed, the resulting model is called a Markov chain. When time is measured in discrete intervals, the model is called a Discrete Time Markov Chain (DTMC). The term Markov chain is more general than DTMC because it also includes continuous time processes called Continuous Time Markov Chains (CTMC). We consider the CTMC in a later section, and we use the term Markov chain when a comment refers to both the DTMC and the CTMC.
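For reference, the "depends only on the present state" condition is the Markov property. A standard formal statement (added here for clarity; it is not part of the original page) for a process X_0, X_1, X_2, ... with state space S is

P(X_{t+1} = j \mid X_t = i, X_{t-1} = i_{t-1}, \ldots, X_0 = i_0) = P(X_{t+1} = j \mid X_t = i)

for all states i, j in S and all times t.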

Unlike most stochastic processes, Markov chains have very agreeable properties that make them easy to analyze. They are often used to approximate quite complex physical systems, even when it is clear that the actual behavior of the system depends on more than just the present state, or when the number of states is not really finite.

To develop a model of a DTMC we need to define the set of system states S and specify the one-step transition matrix P. Given this information, computational procedures are available to answer questions about both the transient and the steady-state behavior of the DTMC. In particular we can compute: the t-step transition matrix, transient and steady-state probability vectors, absorbing state probabilities, and first passage probability distributions. Integrating these results with economic data leads directly to a framework for systems design and optimal decision making under uncertainty.
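As a rough illustration of how these quantities might be computed (a minimal sketch in Python with NumPy, not part of the original page; the three-state transition matrix below is invented purely for demonstration), the following computes a t-step transition matrix, the transient probability vector after t steps, and the steady-state probability vector:

```python
import numpy as np

# Hypothetical one-step transition matrix P for a 3-state DTMC
# (each row sums to 1; the numbers are illustrative only).
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
])

# t-step transition matrix: P(t) = P raised to the power t.
t = 5
P_t = np.linalg.matrix_power(P, t)

# Transient probabilities after t steps from an initial distribution q0.
q0 = np.array([1.0, 0.0, 0.0])   # start in state 0 with certainty
q_t = q0 @ P_t

# Steady-state vector pi satisfying pi P = pi and sum(pi) = 1,
# taken here from the left eigenvector of P for the eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

print("t-step transition matrix:\n", P_t)
print("state probabilities after", t, "steps:", q_t)
print("steady-state probabilities:", pi)
```

This sketch covers only the transition-matrix and probability-vector calculations; absorbing state probabilities and first passage distributions require additional computations on partitions of P.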




  
Operations Research Models and Methods
by Paul A. Jensen
Copyright 2004 - All rights reserved