
Element 0 of tensors does not require grad and does not have a grad_fn nllloss

Answers

Answered by Anonymous

Answer:

Hi, I have a problem here. I have a sequence of Variables which are the outputs of a bi-directional RNN, and I stacked them into a matrix xs_h whose dimension is (seq_length, batch_size, hidden_size). I then want to update the matrix xs_h by convolving over two slices of xs_h; some of the code is as follows:

new_xs_h = xs_h.clone()  # copy of xs_h that stays connected to the autograd graph
vp, vc = xs_h[idx_0, bidx], xs_h[idx_1, bidx]  # the two slices to combine
x = tc.stack([self.f1(vp), self.f2(vc)], dim=1)[None, :, :]  # stack the slices and add a batch dimension for the conv
new_xs_h[idx_1, bidx] = self.tanh(self.l_f2(self.conv(x).squeeze()))  # write the convolved result back
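A minimal, self-contained sketch of this clone-and-update pattern, written with the current tensor API and with placeholder layers standing in for self.f1, self.f2, self.conv and self.l_f2 (the sizes, kernel width and indices here are assumptions, not the original model), might look like this:

import torch as tc
import torch.nn as nn

hidden_size = 8
seq_length, batch_size = 5, 3

# Placeholder layers; the real model's f1, f2, conv and l_f2 may differ.
f1 = nn.Linear(hidden_size, hidden_size)
f2 = nn.Linear(hidden_size, hidden_size)
conv = nn.Conv1d(hidden_size, hidden_size, kernel_size=2)  # fuses the two stacked slices
l_f2 = nn.Linear(hidden_size, hidden_size)
tanh = nn.Tanh()

# Stand-in for the stacked bi-RNN outputs; requires_grad=True keeps it in the graph.
xs_h = tc.randn(seq_length, batch_size, hidden_size, requires_grad=True)
idx_0, idx_1, bidx = 0, 1, 0

new_xs_h = xs_h.clone()                                # clone stays connected to xs_h
vp, vc = xs_h[idx_0, bidx], xs_h[idx_1, bidx]          # two (hidden_size,) slices
x = tc.stack([f1(vp), f2(vc)], dim=1)[None, :, :]      # shape (1, hidden_size, 2)
new_xs_h[idx_1, bidx] = tanh(l_f2(conv(x).squeeze()))  # write the fused vector back

new_xs_h.sum().backward()  # succeeds here: new_xs_h has a grad_fn
print(xs_h.grad.shape)     # gradients flow back to xs_h: (5, 3, 8)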

Actually, I want to update the Variable xs_h and then feed the updated matrix new_xs_h back into my computation graph. However, I get the following error when I call backward() after running the code above:

RuntimeError: element 0 of variables does not require grad and does not have a grad_fn

I do not know why; any reply would be appreciated.

Thanks.
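For reference, this RuntimeError is raised when backward() is called on a result that does not require gradients, which usually means that no input tensor had requires_grad=True or that the graph was cut somewhere (for example via .data or .detach()). A minimal sketch that reproduces the error and then fixes it:

import torch

# Reproduce the error: no input requires gradients, so the result has no grad_fn.
x = torch.randn(3, 4)      # requires_grad is False by default
loss = (x * 2).sum()
# loss.backward()          # -> RuntimeError: element 0 of tensors does not
#                          #    require grad and does not have a grad_fn

# Fix: make the leaf tensor require gradients (or make sure the inputs come from
# modules whose parameters require gradients, and that nothing is detached).
x = torch.randn(3, 4, requires_grad=True)
loss = (x * 2).sum()
loss.backward()            # loss now has a grad_fn, so backward succeeds
print(x.grad)              # d(sum(2*x))/dx is a tensor filled with 2s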
