Math, asked by rssahu7970, 1 year ago

Define regular Markov chain.

Answers

Answered by Anonymous
Regular Markov Chains. DEFINITION 1. A transition matrix T (a stochastic matrix) is said to be regular if some power of T has all positive entries (i.e. entries strictly greater than zero). The Markov chain represented by T is called a regular Markov chain.
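The definition can be checked mechanically: multiply T by itself and see whether some power ends up with no zero entries. A minimal sketch in Python/NumPy (the function name `is_regular` and the bound used for the number of powers to try are my own choices, not from the answer above; for a primitive n x n matrix, Wielandt's theorem guarantees that checking up to power (n-1)^2 + 1 suffices):

```python
import numpy as np

def is_regular(T, max_power=None):
    """Return True if the stochastic matrix T is regular, i.e. some
    power T^k has all strictly positive entries."""
    n = T.shape[0]
    if max_power is None:
        # Wielandt's bound: a primitive n x n matrix becomes
        # positive by power (n-1)^2 + 1 at the latest.
        max_power = (n - 1) ** 2 + 1
    P = np.eye(n)
    for _ in range(max_power):
        P = P @ T                  # compute the next power T^k
        if np.all(P > 0):          # every entry strictly positive?
            return True
    return False

# T itself has a zero entry, but T^2 is all positive, so the chain is regular.
T = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_regular(T))   # True

# The identity matrix keeps zeros off the diagonal in every power: not regular.
print(is_regular(np.eye(2)))  # False
```

Note that a matrix can have zero entries and still be regular, as the first example shows; regularity only requires that *some* power be entirely positive.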


