Markov Chain


I know nothing about stochastics, and this has been a bit hard to wrap my head around. How do Markov chains work, and what are they used for?


3 Answers

Anonymous

Let’s use weather as an illustrative example.

Say the weather has three states: sunny, cloudy, and rainy. We want to predict what the weather will look like in the next hour. A Markov chain gives the probabilities of going from one state to the next, where those probabilities depend only on the current state.

If it’s sunny out, we could say that there’s an equal 1-in-3 chance of it staying sunny, becoming cloudy, or becoming rainy.

Now, these probabilities can change depending on what state you’re currently in. If it’s cloudy, maybe the probabilities are like 10% chance of getting sunny, 20% chance of staying cloudy, and 70% chance of becoming rainy.

You can visualize these probabilities using arrows like [this.](https://www.vatsalp.com/post/markov-chain/fig2.png)
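The transition probabilities above can be sketched in code. This is a minimal simulation, not a formal treatment: the sunny and cloudy rows use the numbers from the example, while the rainy row is an assumed placeholder since the answer doesn’t specify it.

```python
import random

# Transition probabilities: transitions[current][next] is the chance of
# moving from `current` weather to `next` weather in the following hour.
transitions = {
    "sunny":  {"sunny": 1/3, "cloudy": 1/3, "rainy": 1/3},
    "cloudy": {"sunny": 0.1, "cloudy": 0.2, "rainy": 0.7},
    "rainy":  {"sunny": 0.2, "cloudy": 0.4, "rainy": 0.4},  # assumed values
}

def next_state(state):
    """Randomly pick the next state using the current state's probabilities."""
    states = list(transitions[state])
    weights = [transitions[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

# Simulate 10 hours of weather, starting from a sunny hour. Only the most
# recent state matters at each step -- that's the Markov property.
weather = ["sunny"]
for _ in range(9):
    weather.append(next_state(weather[-1]))
print(weather)
```

Each run produces a different sequence, but over many simulated hours the chain spends time in each state in proportion to its long-run (stationary) probabilities.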
