
a deep belief network is a stack of restricted boltzmann machines. true or false?

Answers

Answered by mchatterjee

Although Deep Belief Networks (DBNs) and Deep Boltzmann Machines (DBMs) look very similar diagrammatically, they are actually qualitatively very different. This is because DBNs are directed and DBMs are undirected. If we wanted to fit them into the broader ML picture, we could say DBNs are sigmoid belief networks with many densely connected layers of latent variables, while DBMs are Markov random fields with many densely connected layers of latent variables.

As such, they inherit all the properties of these models. For example, in a DBN, computing P(v|h), where v is the visible layer and h are the hidden variables, is easy. On the other hand, computing the probability of almost anything in a DBM is normally computationally infeasible because of the intractable partition function.
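To make that concrete, here is the standard energy-based formulation (a generic sketch, not tied to any particular paper): the model assigns a probability to each joint configuration of visible units v and hidden units h through an energy function E, and the normalizer Z sums over every possible configuration.

```latex
P(v, h) = \frac{e^{-E(v, h)}}{Z},
\qquad
Z = \sum_{v} \sum_{h} e^{-E(v, h)}
```

For binary units, Z ranges over 2^(number of units) configurations, so it cannot be computed exactly for a DBM of any realistic size; this is the intractable partition function referred to above.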

That being said, there are similarities. For example:

Both DBNs and the original DBM work use initialization schemes based on greedy layer-wise training of restricted Boltzmann machines (RBMs) (a sketch of this appears after this list),
They are both "deep".
They both feature layers of latent variables which are densely connected to the layers above and below, but have no intra-layer connections, etc.
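The greedy layer-wise scheme mentioned in the first point can be sketched in a few lines of Python. This is a minimal, illustrative implementation assuming binary units and CD-1 (contrastive divergence with a single Gibbs step); the class and function names are invented for this sketch and do not come from any library.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli restricted Boltzmann machine trained with CD-1."""

    def __init__(self, n_visible, n_hidden, rng=None):
        self.rng = rng if rng is not None else np.random.default_rng(0)
        self.W = 0.01 * self.rng.standard_normal((n_visible, n_hidden))
        self.a = np.zeros(n_visible)   # visible biases
        self.b = np.zeros(n_hidden)    # hidden biases

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.a)

    def cd1_update(self, v0, lr=0.1):
        # Positive phase: hidden activations driven by the data.
        ph0 = self.hidden_probs(v0)
        h0 = (self.rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one step of Gibbs sampling (CD-1).
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # Approximate log-likelihood gradient (positive minus negative statistics).
        self.W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
        self.a += lr * (v0 - pv1).mean(axis=0)
        self.b += lr * (ph0 - ph1).mean(axis=0)

def pretrain_dbn(X, layer_sizes, epochs=10, lr=0.1):
    """Greedily train a stack of RBMs, each on the hidden activities of the one below."""
    rbms, data = [], X
    for n_hidden in layer_sizes:
        rbm = RBM(data.shape[1], n_hidden)
        for _ in range(epochs):
            rbm.cd1_update(data, lr=lr)
        rbms.append(rbm)
        data = rbm.hidden_probs(data)  # pass activities up to the next layer
    return rbms
```

For example, pretrain_dbn(X, [256, 64]) would greedily train a two-hidden-layer stack on a binary data matrix X; the returned list of RBMs is the pre-trained DBN.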

Answered by topanswers

The answer is TRUE.

A few more details:

A restricted Boltzmann machine (RBM) is a probabilistic model defined by an energy function.
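For the standard binary-unit RBM, the energy function over a visible vector v and a hidden vector h is:

```latex
E(v, h) = -\sum_{i} a_i v_i \;-\; \sum_{j} b_j h_j \;-\; \sum_{i,j} v_i W_{ij} h_j
```

where a and b are the visible and hidden biases and W is the weight matrix between the two layers; lower-energy configurations are assigned higher probability via P(v, h) = e^{-E(v, h)} / Z.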

RBMs are trained one layer at a time and stacked on top of one another, and the resulting stack forms the Deep Belief Network (DBN). This layer-by-layer training of the stack is called 'pre-training' of the Deep Belief Network.
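As a hedged usage sketch of this stacking, here is how two RBMs can be trained one on top of the other with scikit-learn's BernoulliRBM (the data, layer sizes, and hyperparameters below are placeholders):

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

X = np.random.rand(500, 64)  # placeholder data, scaled to [0, 1]

# Layer 1: train on the raw inputs, then project the data into its hidden space.
rbm1 = BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20, random_state=0)
H1 = rbm1.fit_transform(X)

# Layer 2: train on layer 1's hidden activations.
rbm2 = BernoulliRBM(n_components=16, learning_rate=0.05, n_iter=20, random_state=0)
H2 = rbm2.fit_transform(H1)

dbn_layers = [rbm1, rbm2]  # the stacked RBMs form the pre-trained DBN
```

Each fit_transform call trains one RBM and passes its hidden activations up as the training data for the next one; that layer-by-layer stage is the 'pre-training' described above.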

Hope it helps. Thank You!
