Applying Matrices to Probability Problems
A Markov chain is a sequence of events in which the probability of each event depends only on the outcome of the event immediately before it.
A transition matrix is a square matrix in which each element gives the probability of moving from one state of a Markov chain to another.
For example, let events \(A\) and \(B\) represent the two possible states of a Markov chain. The transition matrix would look something like this:
\(P = \begin{pmatrix} P(A | A) & P(B | A) \\ P(A | B) & P(B | B) \end{pmatrix} = \begin{pmatrix} 0.45 & 0.55 \\ 0.65 & 0.35 \end{pmatrix}\)
According to the above transition matrix, if event \(A\) occurs, then the probability that event \(A\) occurs again is \(0.45\), while the probability that event \(B\) occurs next is \(0.55\).
Similarly, if event \(B\) occurs, then the probability that event \(A\) occurs next is \(0.65\), while the probability that event \(B\) occurs again is \(0.35\).
There is also the initial state vector, which gives the probability of each state before any transitions occur. Using the above example, it might look like this:
\(S^{(0)} = \begin{pmatrix} P(A) & P(B) \end{pmatrix} = \begin{pmatrix} 0.4 & 0.6 \end{pmatrix}\)
According to the initial state vector, the probability of event \(A\) occurring is \(0.4\) and the probability of event \(B\) occurring is \(0.6\).
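These two objects can be written down directly in code. The following is a minimal sketch using NumPy (the variable names `P` and `S0` are my own labels, matching the symbols above); it also checks the defining property of a transition matrix, namely that each row sums to \(1\):

```python
import numpy as np

# Transition matrix from the example above.
# Rows are the current state (A, then B); columns are the next state.
P = np.array([[0.45, 0.55],
              [0.65, 0.35]])

# Initial state vector: P(A) = 0.4, P(B) = 0.6.
S0 = np.array([0.4, 0.6])

# Each row of a transition matrix must sum to 1, as must the state vector,
# because each row is a complete probability distribution.
print(P.sum(axis=1))  # both rows sum to 1
print(S0.sum())       # sums to 1
```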
Example
In a game, a player can only be in one of two possible states. Either they are in-game (I) or eliminated (E). The following is the transition matrix of the system:
\(P = \begin{pmatrix} 0.7 & 0.3 \\ 0 & 1 \end{pmatrix}\)
The following is the initial state vector:
\(S^{(0)} = \begin{pmatrix} 0.65 & 0.35 \end{pmatrix}\)
Find the probability distribution of the players after one round.
To find the probability distribution, we can use the following formula:
\(S^{(1)} = S^{(0)} \times P\)
After plugging the matrices into this formula, we can expand the expression and simplify:
\(S^{(1)} = \begin{pmatrix} 0.65 & 0.35 \end{pmatrix} \times \begin{pmatrix} 0.7 & 0.3 \\ 0 & 1 \end{pmatrix}\)
\(S^{(1)} = \begin{pmatrix} 0.7 \times 0.65 + 0 \times 0.35 & 0.3 \times 0.65 + 1 \times 0.35 \end{pmatrix}\)
\(S^{(1)} = \begin{pmatrix} 0.455 & 0.545 \end{pmatrix}\)
Therefore, after one round, the probability that a player is still in-game is \(\boldsymbol{0.455}\), and the probability that they have been eliminated is \(\boldsymbol{0.545}\).
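The calculation above is a single row-vector-by-matrix product, which can be checked with a short NumPy sketch (variable names are my own, matching the symbols in the worked example):

```python
import numpy as np

# Transition matrix for the game; rows/columns ordered (I, E).
# Note the second row: an eliminated player stays eliminated.
P = np.array([[0.7, 0.3],
              [0.0, 1.0]])

# Initial state vector: 65% in-game, 35% eliminated.
S0 = np.array([0.65, 0.35])

# One round: S1 = S0 * P (row vector times matrix).
S1 = S0 @ P
print(S1)  # approximately [0.455, 0.545]
```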
During the summer, a particular city's weather frequently fluctuates between sunny days and thunderstorms. The transition matrix is:
\(P = \begin{pmatrix} 0.6 & 0.4 \\ 0.9 & 0.1 \end{pmatrix}\)
And its initial state is:
\(S^{(0)} = \begin{pmatrix} 0.7 & 0.3 \end{pmatrix}\)
What is the probability distribution after two days?
To find the probability distribution, we can use the following formula:
\(S^{(1)} = S^{(0)} \times P\)
After plugging the matrices into this formula, we can expand the expression and simplify:
\(S^{(1)} = \begin{pmatrix} 0.7 & 0.3 \end{pmatrix} \times \begin{pmatrix} 0.6 & 0.4 \\ 0.9 & 0.1 \end{pmatrix}\)
\(S^{(1)} = \begin{pmatrix} 0.6 \times 0.7 + 0.9 \times 0.3 & 0.4 \times 0.7 + 0.1 \times 0.3 \end{pmatrix}\)
\(S^{(1)} = \begin{pmatrix} 0.69 & 0.31 \end{pmatrix}\)
We have now determined the distribution after one day. To find the distribution after two days, we apply the same formula again, using \(S^{(1)}\) in place of \(S^{(0)}\) to find \(S^{(2)}\).
We can express this algebraically as such:
\(S^{(2)} = S^{(1)} \times P\)
Then, we can plug the appropriate matrices into the expression and solve for the distribution:
\(S^{(2)} = \begin{pmatrix} 0.69 & 0.31 \end{pmatrix} \times \begin{pmatrix} 0.6 & 0.4 \\ 0.9 & 0.1 \end{pmatrix}\)
\(S^{(2)} = \begin{pmatrix} 0.6 \times 0.69 + 0.9 \times 0.31 & 0.4 \times 0.69 + 0.1 \times 0.31 \end{pmatrix}\)
\(S^{(2)} = \begin{pmatrix} 0.693 & 0.307 \end{pmatrix}\)
Therefore, after two days, the probability of a sunny day is \(\boldsymbol{0.693}\) and the probability of thunderstorms is \(\boldsymbol{0.307}\).
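Applying the formula twice is the same as multiplying the initial state vector by \(P^2\), so the two-day distribution can be computed in one step. A sketch of this, assuming the matrices from the weather example (variable names are mine):

```python
import numpy as np

# Weather transition matrix; rows/columns ordered (sunny, thunderstorm).
P = np.array([[0.6, 0.4],
              [0.9, 0.1]])

# Initial state vector: 70% sunny, 30% thunderstorms.
S0 = np.array([0.7, 0.3])

# Two days in one step: S2 = S0 * P^2.
S2 = S0 @ np.linalg.matrix_power(P, 2)
print(S2)  # approximately [0.693, 0.307]

# Equivalent day-by-day form, matching the worked example:
S1 = S0 @ P          # distribution after one day
S2_again = S1 @ P    # distribution after two days
```

More generally, the distribution after \(n\) steps is \(S^{(n)} = S^{(0)} P^n\), which is why `matrix_power` is convenient for longer horizons.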