A Markov chain is a probabilistic system that governs transitions among states through successive events. For example, in the American game of baseball, the probability of reaching base differs depending on the "count": the number of balls and strikes facing the batter. At 3 balls and 0 strikes, the batter reaches base with probability 0.750; at 0 balls and 2 strikes, with probability only 0.221. A simplified system would have one state for each combination of balls and strikes. In a Monte Carlo process, pitches are simulated over and over: the outcome probabilities at each pitch are determined by the current state, and each outcome in turn determines the next state and its probabilities (returning to 0-0 whenever the batter either reaches base or records an out).
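The pitch-by-pitch process described above can be sketched as a small simulation. The per-pitch probabilities below (chance of a ball, a strike, or a ball in play, and the chance that contact produces a hit) are illustrative assumptions, not real baseball data, and the model ignores complications such as foul balls with two strikes:

```python
import random

# Assumed per-pitch outcome probabilities (illustrative, not real data):
P_BALL, P_STRIKE, P_IN_PLAY = 0.35, 0.45, 0.20
P_HIT_ON_CONTACT = 0.30  # assumed chance a ball in play becomes a hit

def simulate_plate_appearance(rng):
    """Walk the ball-strike Markov chain from the 0-0 count until the
    batter reaches base (True) or records an out (False)."""
    balls, strikes = 0, 0
    while True:
        r = rng.random()
        if r < P_BALL:
            balls += 1
            if balls == 4:
                return True               # walk: batter reaches base
        elif r < P_BALL + P_STRIKE:
            strikes += 1
            if strikes == 3:
                return False              # strikeout
        else:
            # Ball in play: resolve immediately as hit or out.
            return rng.random() < P_HIT_ON_CONTACT

def estimate_on_base(n, seed=0):
    """Monte Carlo estimate of the reach-base probability over n
    simulated plate appearances."""
    rng = random.Random(seed)
    return sum(simulate_plate_appearance(rng) for _ in range(n)) / n
```

Each plate appearance is one run of the chain: the state (the count) determines the transition probabilities, and the estimate converges as more appearances are simulated.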
Week #17 – Markov Chain Monte Carlo (MCMC)