Asked by sdfsfs36591

Discrete-time Markov chain vs. continuous-time Markov chain

Answers

Answered by navjotn406

Answer:

A Markov chain is a discrete-valued Markov process: discrete-valued means that the state space of possible values of the chain is finite or countable. In a discrete-time Markov chain, the system can change state only at fixed, integer-valued time steps. A continuous-time Markov chain is one in which changes to the system can happen at any time along a continuous interval.
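
To make the distinction concrete, here is a minimal Python sketch simulating both kinds of chain on an illustrative two-state example. The transition matrix P and the rate matrix Q below are made-up values chosen only for demonstration; they are not part of the original question.

import random

# Discrete-time Markov chain: one transition per integer time step,
# governed by a transition probability matrix P (rows sum to 1).
# These probabilities are illustrative assumptions.
P = [[0.9, 0.1],   # P[i][j] = probability of jumping from state i to j
     [0.5, 0.5]]

def simulate_dtmc(start, n_steps):
    """Return the state visited at each integer time step."""
    state, path = start, [start]
    for _ in range(n_steps):
        state = 0 if random.random() < P[state][0] else 1
        path.append(state)
    return path

# Continuous-time Markov chain: the chain holds each state for an
# exponentially distributed time, then jumps, so transitions can occur
# at any real-valued time. Q[i][j] (i != j) is the jump rate i -> j;
# diagonal entries make each row sum to 0. Rates are illustrative.
Q = [[-1.0,  1.0],
     [ 2.0, -2.0]]

def simulate_ctmc(start, t_end):
    """Return a list of (time, state) pairs up to time t_end."""
    t, state, path = 0.0, start, [(0.0, start)]
    while True:
        rate = -Q[state][state]        # total rate of leaving `state`
        t += random.expovariate(rate)  # exponential holding time
        if t > t_end:
            break
        state = 1 - state              # only one other state to jump to
        path.append((t, state))
    return path

print(simulate_dtmc(0, 10))   # e.g. [0, 0, 0, 1, 1, 0, ...]
print(simulate_ctmc(0, 5.0))  # e.g. [(0.0, 0), (1.37, 1), ...]

Note how the discrete-time path advances exactly one state per integer step, while the continuous-time path records real-valued jump times drawn from exponential holding distributions.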

