Define a regular Markov chain.
Answers
Regular Markov Chains. DEFINITION 1. A transition matrix (stochastic matrix) T is said to be regular if some power of T has all positive entries (i.e. every entry is strictly greater than zero). The Markov chain represented by T is called a regular Markov chain.
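The definition can be checked numerically by raising T to successive powers and testing whether all entries become strictly positive. A minimal sketch (the helper name `is_regular` and the use of Wielandt's bound (n-1)^2 + 1 on the number of powers to try are my own choices, not from the answer above):

```python
import numpy as np

def is_regular(T):
    """Return True if the stochastic matrix T is regular,
    i.e. some power of T has all strictly positive entries."""
    n = T.shape[0]
    # Wielandt's bound: for an n x n primitive matrix, some power
    # up to (n-1)^2 + 1 is already all-positive, so checking that
    # many powers is enough to decide.
    limit = (n - 1) ** 2 + 1
    P = np.eye(n)
    for _ in range(limit):
        P = P @ T
        if np.all(P > 0):
            return True
    return False

# T itself has a zero entry, but T^2 is all positive -> regular
T = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_regular(T))  # True

# An absorbing chain: row [1, 0] persists in every power -> not regular
A = np.array([[1.0, 0.0],
              [0.5, 0.5]])
print(is_regular(A))  # False
```

Note that T above shows why the definition says "some power": T has a zero entry, yet T² = [[0.5, 0.5], [0.25, 0.75]] is entirely positive, so the chain is still regular.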