Discrete-time Markov chain vs. continuous-time Markov chain
Answers
Answer:
A Markov chain is a discrete-valued Markov process: "discrete-valued" means that the state space of possible values of the chain is finite or countable. A discrete-time Markov chain is one in which the system changes state only at fixed, discrete time steps, whereas a continuous-time Markov chain is one in which changes to the system can happen at any time along a continuous interval.
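To make the contrast concrete, here is a minimal Python sketch (the function names and the two-state matrices P and Q are illustrative assumptions, not part of the answer above): the discrete-time chain makes exactly one probabilistic jump per integer time step according to a transition probability matrix, while the continuous-time chain waits an exponentially distributed holding time in each state and then jumps according to a rate matrix.

```python
import random

def simulate_dtmc(P, state, n_steps):
    """Discrete-time Markov chain: exactly one transition per time step.
    P[i][j] is the probability of moving from state i to state j."""
    path = [state]
    for _ in range(n_steps):
        state = random.choices(range(len(P)), weights=P[state])[0]
        path.append(state)
    return path  # states visited at times 0, 1, ..., n_steps

def simulate_ctmc(Q, state, t_max):
    """Continuous-time Markov chain: transitions can occur at any real time.
    Q[i][j] (i != j) is the rate of jumping from i to j, and
    Q[i][i] = -(sum of the other rates in row i)."""
    t, path = 0.0, [(0.0, state)]
    while True:
        rate = -Q[state][state]        # total exit rate from the current state
        if rate == 0:                  # absorbing state: the chain stays forever
            break
        t += random.expovariate(rate)  # exponentially distributed holding time
        if t >= t_max:
            break
        # Jump according to the embedded (jump) chain probabilities Q[i][j]/rate
        weights = [Q[state][j] if j != state else 0.0 for j in range(len(Q))]
        state = random.choices(range(len(Q)), weights=weights)[0]
        path.append((t, state))
    return path  # (jump time, new state) pairs observed before t_max

if __name__ == "__main__":
    P = [[0.9, 0.1],   # hypothetical two-state transition probability matrix
         [0.5, 0.5]]
    Q = [[-0.1, 0.1],  # hypothetical rate matrix with the same structure
         [0.5, -0.5]]
    print("DTMC path:", simulate_dtmc(P, 0, 10))
    print("CTMC path:", simulate_ctmc(Q, 0, 20.0))
```

Running this prints one sample path from each chain: the DTMC path has one entry per time step, while the CTMC path records the random, real-valued times at which jumps happen, which is exactly the difference the answer describes.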