Asked by vishnukumar00021, 10 months ago

Define (i) an Ergodic Markov Chain, (ii) a Stationary Markov Chain.

Answers

Answered by mufaddalt786

Answer:

A Markov chain is called irreducible if it is possible to eventually get from every state to every other state with positive probability. An irreducible chain is called ergodic if it is also aperiodic (and, on an infinite state space, positive recurrent); an ergodic chain converges to a unique stationary distribution regardless of its starting state.
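As a concrete illustration of the distinction, consider the two-state chain with transition matrix

$$P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.$$

This chain is irreducible (each state reaches the other in one step) but not ergodic, because it is periodic with period 2: started in state 1, it occupies state 1 at every even step and state 2 at every odd step, so the distribution of $X_n$ never settles to a limit.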

A Markov chain is called stationary if its marginal distribution does not change over time: the chain is started from a distribution π satisfying π = πP, where P is the transition probability matrix, so that X_n has distribution π for every n. Such a π is called a stationary (or invariant) distribution. If π and P additionally satisfy the detailed-balance equations π_i P_{ij} = π_j P_{ji}, the chain is called reversible; the length of a queue is a classic example of a Markov chain that turns out to be reversible.

Explanation:
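As a rough numerical illustration, here is a minimal Python sketch (using numpy; the 3-state matrix P below is a made-up example, not from the question) that computes a stationary distribution as the left eigenvector of P for eigenvalue 1 and shows the ergodic behaviour, namely that every row of P^n converges to π:

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1).
# All entries are positive, so the chain is irreducible and
# aperiodic, hence ergodic.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# The stationary distribution pi solves pi = pi @ P with sum(pi) = 1.
# Equivalently, pi is a right eigenvector of P.T for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                       # normalise to a probability vector

print("stationary pi:", pi)
print("pi @ P       :", pi @ P)          # equals pi, up to rounding

# Ergodicity in action: every row of P^n converges to pi.
print("P^50 rows    :\n", np.linalg.matrix_power(P, 50))
```

Running this prints the same vector for `pi` and `pi @ P`, and every row of `P^50` matches that vector, which is exactly the convergence-from-any-start behaviour that makes the chain ergodic.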
