Computer Science, asked by chaitanyaceo4, 4 months ago

Consider a neural network with 2 input neurons, i1 and i2, one hidden layer
with 2 neurons, h1 and h2, and an output layer with 2 neurons, o1 and o2. Assume i1 = 0.1,
w1 (from i1 to h1) = 0.27, w2 (from i2 to h1) = 0.57, bias b1 = 1, and w4 (weight from
b1 to h1) = 0.4. Draw the network, marking all the neurons, weights, and the necessary
connections. Work out the weighted-sum and activation calculations for all the neurons,
assuming tanh as the activation function. Find the updated weights at the end of the
back-propagation algorithm.

Answers

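The question only specifies i1, w1, w2, b1, and w4, so a complete numeric answer requires assumptions. Below is a minimal sketch of one forward pass and one back-propagation step with tanh, assuming values for everything the question leaves out: the second input i2, the remaining input-to-hidden weights, the hidden-to-output weights (named w5 to w8 here by analogy), the targets, and the learning rate. All of those assumed numbers are labelled in the comments; only the first five constants come from the question.

```python
import math

def tanh_prime(x):
    """Derivative of tanh: 1 - tanh(x)^2."""
    t = math.tanh(x)
    return 1.0 - t * t

# --- Values given in the question ---
i1 = 0.1
w1 = 0.27    # i1 -> h1
w2 = 0.57    # i2 -> h1
b1 = 1.0     # bias neuron
w4 = 0.4     # b1 -> h1

# --- Values NOT given in the question (assumed for illustration only) ---
i2 = 0.2                              # assumed second input
w_i1_h2, w_i2_h2, w_b1_h2 = 0.33, 0.62, 0.45  # assumed weights into h2
w5, w6 = 0.2, 0.3                     # assumed h1 -> o1, h2 -> o1
w7, w8 = 0.4, 0.5                     # assumed h1 -> o2, h2 -> o2
t1, t2 = 0.01, 0.99                   # assumed targets for o1, o2
eta = 0.5                             # assumed learning rate

# Forward pass: hidden layer (weighted sum, then tanh)
net_h1 = w1 * i1 + w2 * i2 + w4 * b1
out_h1 = math.tanh(net_h1)
net_h2 = w_i1_h2 * i1 + w_i2_h2 * i2 + w_b1_h2 * b1
out_h2 = math.tanh(net_h2)

# Forward pass: output layer (no bias into outputs -- an assumption)
net_o1 = w5 * out_h1 + w6 * out_h2
out_o1 = math.tanh(net_o1)
net_o2 = w7 * out_h1 + w8 * out_h2
out_o2 = math.tanh(net_o2)

# Squared-error loss: E = 1/2 * sum over outputs of (target - output)^2
E = 0.5 * ((t1 - out_o1) ** 2 + (t2 - out_o2) ** 2)

# Backward pass: error signal (delta) at each output neuron
delta_o1 = (out_o1 - t1) * tanh_prime(net_o1)
delta_o2 = (out_o2 - t2) * tanh_prime(net_o2)

# Gradient for an output-layer weight, e.g. w5 (h1 -> o1),
# and the gradient-descent update w := w - eta * dE/dw
dE_dw5 = delta_o1 * out_h1
w5_new = w5 - eta * dE_dw5

# Backpropagate the error to the hidden layer and update w1 (i1 -> h1)
delta_h1 = (delta_o1 * w5 + delta_o2 * w7) * tanh_prime(net_h1)
dE_dw1 = delta_h1 * i1
w1_new = w1 - eta * dE_dw1

print(f"net_h1 = {net_h1:.4f}, out_h1 = tanh(net_h1) = {out_h1:.4f}")
print(f"loss E = {E:.6f}")
print(f"w5: {w5} -> {w5_new:.6f}")
print(f"w1: {w1} -> {w1_new:.6f}")
```

Note that a single update like this gives the weights after one iteration, not the "optimum" weights: back-propagation must be repeated over many iterations until the error stops decreasing before the weights can be called final.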
