At the moment, if you set num_derivatives to something lower than the actual number of derivatives you compute, you'll get something like:
```
----------------------------------------------------------------------
Traceback (most recent call last):
  File "test/test_jit.py", line 539, in test_mini_wlm
    z.sum().backward()
  File "/data/users/ezyang/onnx-pytorch/pytorch/torch/autograd/variable.py", line 158, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
  File "/data/users/ezyang/onnx-pytorch/pytorch/torch/autograd/__init__.py", line 98, in backward
    variables, grad_variables, retain_graph)
RuntimeError: vector::_M_range_check
```
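For context, `vector::_M_range_check` is the message libstdc++ attaches to the `std::out_of_range` exception thrown by `std::vector::at()`, which suggests the backward pass is indexing a derivative table past the size implied by `num_derivatives`. A minimal C++ sketch of that failure mode (the `grads` vector below is a hypothetical stand-in for the internal derivative storage, not the actual PyTorch code):

```cpp
#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
  // Hypothetical: num_derivatives was declared as 2, so only 2 slots exist,
  // but the backward pass asks for a third derivative (index 2).
  std::vector<double> grads(2);
  try {
    double g = grads.at(2);  // index >= size(): at() throws std::out_of_range
    std::cout << g << '\n';
  } catch (const std::out_of_range& e) {
    // In libstdc++ the what() message begins with "vector::_M_range_check",
    // which is exactly what surfaces in the RuntimeError above.
    std::cout << e.what() << '\n';
  }
}
```

A friendlier behavior would presumably be an explicit bounds check against `num_derivatives` that raises a descriptive error naming the mismatch, rather than letting the raw range-check exception escape to Python.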