
Suppose we have an 11-layer neural network that takes 10 hours to train on a GPU
with 32 GB of VRAM. At test time, it takes 5 seconds for a single data point.
Now we change the architecture by adding dropout after the 2nd, 4th, 6th and
8th layers, with a rate of 0.2 for each dropout layer.
What would be the testing time for this new architecture?
A Doubles
B Halves
C Same
D Can't Say
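
A hypothetical sketch of the modified architecture (the framework, layer widths and layer types are assumptions here, since the question does not specify them; PyTorch is used only for illustration):

import torch.nn as nn

# 11 linear layers with Dropout(p=0.2) inserted after the
# 2nd, 4th, 6th and 8th layers; the width of 128 is made up.
def build_model(width=128, n_layers=11, dropout_after=(2, 4, 6, 8)):
    layers = []
    for i in range(1, n_layers + 1):
        layers.append(nn.Linear(width, width))
        layers.append(nn.ReLU())
        if i in dropout_after:
            layers.append(nn.Dropout(p=0.2))
    return nn.Sequential(*layers)

model = build_model()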

Answers

Answered by vermahariom1000

Answer:

A


Answered by rishubcheddllap8s1yd

Answer:

D

Explanation:

Because dropout is random, it changes how much work each forward pass does,

so the new time should be lower than or equal to the old one, but definitely not halved, hence "Can't Say".
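
For intuition, here is a minimal PyTorch-style sketch (the framework and the layer sizes are assumptions, not part of the question) showing that dropout only fires in training mode and becomes a pass-through at evaluation time:

import torch
import torch.nn as nn

# A single block with dropout at rate 0.2, as in the question.
block = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.2))
x = torch.randn(1, 64)

# Training mode: dropout randomly zeroes about 20% of the activations,
# so two forward passes on the same input generally differ.
block.train()
print(torch.allclose(block(x), block(x)))  # usually False

# Evaluation mode: dropout acts as an identity (nothing is dropped),
# so the forward pass is deterministic.
block.eval()
print(torch.allclose(block(x), block(x)))  # True

Calling eval() (or the equivalent inference mode in another framework) is what one would normally do before timing a single test prediction.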
