Difference between a Markov chain and a Markov decision process
Markov decision processes (MDPs) are an extension of Markov chains; the difference is the addition of actions (allowing choice) and rewards (giving motivation). Conversely, if only one action is available in each state (e.g. "wait") and every reward is the same (e.g. zero), a Markov decision process reduces to a Markov chain.
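A minimal sketch of this reduction, using a hypothetical two-state weather example (the states, the single "wait" action, and the probabilities are all made up for illustration). Because each state offers exactly one action and every reward is zero, the agent has no choice to make, and the resulting state sequence is just a Markov chain:

```python
import random

# Hypothetical MDP: transitions[state][action] -> list of (next_state, prob).
transitions = {
    "sunny": {"wait": [("sunny", 0.8), ("rainy", 0.2)]},
    "rainy": {"wait": [("rainy", 0.6), ("sunny", 0.4)]},
}
# All rewards identical (zero), so rewards give no motivation to act.
rewards = {"sunny": {"wait": 0.0}, "rainy": {"wait": 0.0}}

def step(state, action):
    """One MDP step: sample the next state and return it with the reward."""
    outcomes = transitions[state][action]
    r, cumulative = random.random(), 0.0
    for next_state, p in outcomes:
        cumulative += p
        if r < cumulative:
            return next_state, rewards[state][action]
    return outcomes[-1][0], rewards[state][action]

# With one action per state the policy is forced, so simulating the MDP
# produces exactly a Markov chain over {"sunny", "rainy"}.
state = "sunny"
trajectory = [state]
total_reward = 0.0
for _ in range(10):
    state, reward = step(state, "wait")
    trajectory.append(state)
    total_reward += reward
```

With more than one action per state (say "wait" vs. "carry umbrella") and differing rewards, the same structure becomes a genuine MDP, because the next-state distribution and the payoff then depend on the chosen action.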