Computer Science, asked by fizakhan2004, 1 month ago

Q:10. Bidirectional RNN:
1. Trained to predict both the positive and negative directions of time simultaneously.
2. Applications include speech recognition, handwriting recognition, etc.
3. After the forward and backward passes are done, the weights are updated.
4. All of the above

Answers

Answered by kavyagaharwar

Answer:

4. All of the above. Bidirectional recurrent neural networks (BRNNs) connect two hidden layers running in opposite temporal directions to the same output, so the output layer can draw on information from both past (backward) and future (forward) states simultaneously. Invented in 1997 by Schuster and Paliwal,[1] BRNNs were introduced to increase the amount of input information available to the network. Multilayer perceptrons (MLPs) and time delay neural networks (TDNNs), for example, are limited in input flexibility because they require fixed-length input, and standard recurrent neural networks (RNNs) are limited because future input information cannot be reached from the current state. BRNNs have neither restriction: their input need not be fixed in length, and future input information is reachable from the current state.[2]
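The two-direction idea described above can be sketched in a few lines. This is a minimal illustrative forward pass, not a production implementation: scalar inputs and scalar weights (`wf`, `uf`, `wb`, `ub` are names chosen here, not from any library) stand in for the usual weight matrices, and training/weight updates are omitted.

```python
import math

def birnn_forward(x_seq, wf, uf, wb, ub):
    """Toy bidirectional RNN forward pass with scalar states.

    x_seq: list of input values (one per time step)
    wf, uf: forward-direction input and recurrent weights
    wb, ub: backward-direction input and recurrent weights
    Returns a list of (forward_state, backward_state) pairs, one per
    time step, so each output position sees past and future context.
    """
    # Forward pass: hidden state accumulates past context.
    hf, h = [], 0.0
    for x in x_seq:
        h = math.tanh(wf * x + uf * h)
        hf.append(h)
    # Backward pass: a separate hidden state accumulates future context.
    hb, h = [], 0.0
    for x in reversed(x_seq):
        h = math.tanh(wb * x + ub * h)
        hb.append(h)
    hb.reverse()  # realign backward states with original time order
    # Concatenate the two directions at each time step.
    return list(zip(hf, hb))
```

Note that the forward state at the first time step depends only on the first input, and the backward state at the last time step depends only on the last input; only after both passes finish is the combined output available, which is why the weights are updated after the forward and backward passes are done.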
