Absorbing state of a Markov chain: definition
Answers
Answered by
In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left.
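A minimal sketch in Python may help, using a hypothetical 3-state transition matrix: a state i is absorbing exactly when its self-transition probability P[i, i] equals 1, so the chain can never leave it.

```python
import numpy as np

# Hypothetical transition matrix: row i gives the probabilities of
# moving from state i to each other state.
P = np.array([
    [0.5, 0.3, 0.2],   # state 0: can move to state 1 or 2
    [0.0, 0.6, 0.4],   # state 1: can move to state 2
    [0.0, 0.0, 1.0],   # state 2: absorbing, it always stays in itself
])

# A state is absorbing iff its diagonal entry is 1
# (all other entries in that row are then necessarily 0).
absorbing_states = [i for i in range(len(P)) if P[i, i] == 1.0]
print(absorbing_states)   # [2]
```

In this example every state can eventually reach state 2, so the chain qualifies as an absorbing Markov chain under the definition above.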
Answered by
In the mathematical theory of probability, an absorbing Markov chain is one in which every state can reach an absorbing state.