Markov chain: how to find the initial state probability
Answers
You define the initial state vector. The vector <0, 1, 0, …, 0> means the chain always starts in the state whose entry holds the 1 (here the second state, or state 1 if you number states from 0). You may also use <p, p, …, p> with p = 1/n, where n is the number of states; this means it is equally likely to start in any state.
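A minimal sketch of both choices, using a hypothetical 3-state transition matrix P (the numbers are made up for illustration). The distribution after one step is the row vector product pi1 = pi0 · P:

```python
# Hypothetical 3-state transition matrix (rows sum to 1).
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.8, 0.1],
    [0.2, 0.2, 0.6],
]

def step(pi, P):
    """One step of the chain: pi_next[j] = sum_i pi[i] * P[i][j]."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

# Deterministic start in the first state: <1, 0, 0>.
pi0 = [1.0, 0.0, 0.0]

# Equally likely start in any state: <1/n, 1/n, 1/n>.
n = len(P)
pi0_uniform = [1.0 / n] * n

pi1 = step(pi0, P)
print(pi1)  # distribution after one step
```

Starting from a one-hot vector, one step simply picks out the corresponding row of P.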
It all depends on the context. For gambler's ruin with each gamble worth $1 and a total of $n at stake, there are n + 1 states (fortunes 0 through n). If the gambler starts with fortune i, the initial state vector has a 1 in the entry for state i and 0 elsewhere; for example, starting at fortune 0 gives <1, 0, 0, …, 0>.
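A sketch of the gambler's ruin case, assuming a fair game (p = 0.5), n = 3, and a starting fortune of $1; the states are fortunes 0 through n, with 0 and n absorbing:

```python
def gamblers_ruin_P(n, p=0.5):
    """Transition matrix for gambler's ruin on states 0..n.

    States 0 (broke) and n (won everything) are absorbing;
    from any other state the gambler wins $1 with probability p,
    loses $1 with probability 1 - p.
    """
    P = [[0.0] * (n + 1) for _ in range(n + 1)]
    P[0][0] = 1.0
    P[n][n] = 1.0
    for i in range(1, n):
        P[i][i + 1] = p      # win $1
        P[i][i - 1] = 1 - p  # lose $1
    return P

def step(pi, P):
    """One step of the chain: pi_next[j] = sum_i pi[i] * P[i][j]."""
    m = len(P)
    return [sum(pi[i] * P[i][j] for i in range(m)) for j in range(m)]

n = 3
P = gamblers_ruin_P(n)

# Start with fortune $1: all probability mass on state 1.
pi = [0.0] * (n + 1)
pi[1] = 1.0

for _ in range(2):
    pi = step(pi, P)
print(pi)  # distribution over fortunes after two gambles
```

After two steps the mass accumulating in states 0 and n is the probability the game has already ended.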