Originally posted on alexhwoods:
A Markov chain is a random process in which the next state depends only on the current state (or, more generally, on a fixed number of previous states). Unlike a sequence of coin flips, these events are dependent. It's easiest to understand through an example.
Imagine the weather can only be rainy or sunny. That is, the state space is {rainy, sunny}. We can represent our Markov model as a transition matrix, where each row corresponds to a current state and each entry in that row gives the probability of moving to the corresponding next state.
However, it’s easier to understand with this state transition diagram.
In other words, given that today is sunny, there is a 0.9 probability that tomorrow will be sunny, and a 0.1 probability that tomorrow will be rainy.
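As a sketch of how this plays out in code, here is a minimal simulation of the weather chain. The sunny row (0.9 stay sunny, 0.1 rain) comes from the post; the rainy row is an illustrative assumption, since the excerpt doesn't give it:

```python
import random

# Transition matrix: each row maps a current state to next-state probabilities.
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},   # from the post
    "rainy": {"sunny": 0.5, "rainy": 0.5},   # assumed for illustration
}

def next_state(current):
    """Sample tomorrow's weather given today's state."""
    r = random.random()
    cumulative = 0.0
    for state, p in transitions[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

# Simulate a week of weather starting from a sunny day.
random.seed(0)
weather = ["sunny"]
for _ in range(6):
    weather.append(next_state(weather[-1]))
print(weather)
```

Because the chain only ever looks at the last element of `weather`, the whole history is summarized by today's state, which is exactly the Markov property.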
Text Generator
One cool application of this is a language model, in which we predict the next word based on the current word(s). If we just predict based on the last word…
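To make the idea concrete, here is a minimal sketch of a first-order Markov text generator. The tiny corpus and the function names are my own illustration, not from the post; the technique is the one described: for each word, record which words followed it, then sample from those followers:

```python
import random
from collections import defaultdict

# A tiny illustrative corpus (assumed example text, not from the post).
corpus = "the cat sat on the mat the cat ate the fish".split()

# First-order Markov model: map each word to the list of words that follow it.
# Repeated followers appear multiple times, so sampling reflects frequency.
model = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    model[current].append(nxt)

def generate(start, length):
    """Generate text by repeatedly sampling a follower of the last word."""
    words = [start]
    for _ in range(length - 1):
        followers = model.get(words[-1])
        if not followers:  # dead end: this word was never followed by anything
            break
        words.append(random.choice(followers))
    return " ".join(words)

random.seed(1)
print(generate("the", 8))
```

A real generator would train on a much larger corpus and often condition on the last two or three words (a higher-order chain) rather than just one, trading data requirements for more coherent output.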