## Markov Chains Notes

This is an extract of our Markov Chains document, which we sell as part of our Operational Research Techniques Notes collection written by the top tier of LSE students.

Lecture 15: Markov Chains Summary

* Introduction

* Markov Chains

Introduction

* Stochastic Processes = Processes that evolve over time in a probabilistic manner
* Most processes are stochastic
    * If not, the future is fully determined

* We make the assumption that stochastic processes are fully described by two sets of information:
* The current state
* Transition probabilities

* Stochastic processes with such assumptions are Markov Processes

* The simplifying assumption is that history does not matter

* Markov Chain = Markov Process in discrete time (e.g. weeks or generations)

Example of modelling a Markov Chain

* We can use a Markov Chain to model the future class structure of a society

* Current state = current class structure (what proportion of individuals are in each class)

* Transition probabilities = e.g. the probability that a son will be in each class given that his father was upper class, etc.

* Therefore the future class structure depends only on the current one, and we do not need to know about the past
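The class-structure example above can be sketched numerically. The matrix entries and the starting class proportions below are purely illustrative assumptions, not figures from the notes:

```python
import numpy as np

# Hypothetical transition matrix between social classes (each row sums to 1).
# P[i][j] = probability a son is in class j given his father is in class i.
# Classes: 0 = upper, 1 = middle, 2 = lower. Numbers are made up for illustration.
P = np.array([
    [0.45, 0.45, 0.10],
    [0.05, 0.70, 0.25],
    [0.01, 0.50, 0.49],
])

# Current state: the proportion of individuals in each class (assumed values).
current = np.array([0.10, 0.40, 0.50])

# The next generation's class structure depends only on the current structure
# and the transition probabilities: x' = x P. The past plays no role.
next_gen = current @ P
print(next_gen)
```

Multiplying the current distribution by the transition matrix once per generation captures the Markov assumption directly: each step uses only the current state.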

* Another assumption that is commonly made:
* Stationary transition probabilities = the transition probabilities stay the same over time
* With this we can model the long run equilibrium state
* These determine the long run outcomes
    * Not the current distribution

* Markov Chains are good at modelling market share in the short run

Markov Chains

* Examples include:
* Brand switching in consumer purchases
* Changes in social class over generations
* Changes in staff employed at different levels in firms
* Progress of a disease in populations

* Models movement between different states over time

* In any time period (stage) a unit will be in one and only one state

* Between states there can be a transition to any of a number of other states

Transition Probabilities

* Transition probability from state i to state j: p_ij
    * In Markov Chains, it depends only on i and j and not on how state i was reached
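The "depends only on i and j" property can be shown in a small simulation: each step samples the next state using only the current state's row of the transition matrix, never the path taken so far. The matrix values here are illustrative assumptions:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Hypothetical transition probabilities: P[i][j] = p_ij, row i sums to 1.
P = [
    [0.45, 0.45, 0.10],
    [0.05, 0.70, 0.25],
    [0.01, 0.50, 0.49],
]

def step(state):
    """Sample the next state using only the current state (the Markov property):
    the history of earlier states is never consulted."""
    return random.choices(range(len(P)), weights=P[state])[0]

state = 0
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)
```

Note that `step` takes only the current state as input; that is exactly the assumption that history does not matter.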
