Science, asked by nikhatj145, 1 year ago

Absorbing state of Markov chain definition

Answers

Answered by NaughtySage
In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left.
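As an illustration, a common way to write this condition formally (the transition-probability notation $p_{ij}$ and the example matrix below are not part of the original answer, just a standard sketch):

```latex
% A state i is absorbing when the chain, once in i, stays in i with probability 1:
\[
  p_{ii} = \Pr(X_{t+1} = i \mid X_t = i) = 1,
  \qquad
  p_{ij} = 0 \text{ for all } j \neq i .
\]
% Example: a 3-state chain whose third state is absorbing
% (the last row keeps all probability mass on state 3).
\[
  P =
  \begin{pmatrix}
    0.5 & 0.3 & 0.2 \\
    0.4 & 0.4 & 0.2 \\
    0   & 0   & 1
  \end{pmatrix}
\]
```

In this example the chain is an absorbing Markov chain, because from states 1 and 2 there is a positive-probability path into state 3, and state 3 can never be left.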
Answered by shaynakhan
In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state, that is, a state which cannot be left once it is entered.