Is it possible to see gradient function? #587
Comments
Have you got the answer? I have the same problem. I need to get the formula for the first derivative of the function.
I do not know whether this is possible with autograd, but if your function is simple enough, https://www.matrixcalculus.org/ can give you both the mathematical formula and a Python program to compute the gradient. Using your function, I used the Jacobian here, since it maps a vector to a vector (so the derivative is a matrix rather than a gradient vector):

```python
import autograd.numpy as np
from autograd import jacobian

# Output from https://matrixcalculus.org
def fAndG(x):
    assert isinstance(x, np.ndarray)
    dim = x.shape
    assert len(dim) == 1
    x_rows = dim[0]

    t_0 = np.exp(-(2 * x))
    t_1 = np.ones(x_rows) + t_0
    t_2 = np.ones(x_rows) - t_0
    functionValue = t_2 / t_1
    gradient = (2 * np.diag(t_0 / t_1)) + (2 * np.diag((t_2 * t_0) / (t_1 * t_1)))
    return functionValue, gradient

def tanh(x):
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

x = np.array([0.1, 0.2, 0.3])
tanh_x, jacobian_tanh_x = fAndG(x)

# tanh(x) matches
assert np.allclose(tanh_x, tanh(x))

# Jacobian matches
assert np.allclose(jacobian_tanh_x, jacobian(tanh)(x))
```

You could also visualize the computational graph with JAX, but that is a bit more involved: https://bnikolic.co.uk/blog/python/jax/2020/10/20/jax-outputgraph.html
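As a quicker (if less graphical) alternative to the approach in that post, `jax.make_jaxpr` prints JAX's intermediate representation of a traced function, which is one way to inspect the computation graph. A minimal sketch under that assumption, not the method from the linked post:

```python
# Minimal sketch: inspect JAX's trace of tanh via its jaxpr.
import jax
import jax.numpy as jnp

def tanh(x):
    y = jnp.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

x = jnp.array([0.1, 0.2, 0.3])

# make_jaxpr traces the function and returns its jaxpr (JAX's intermediate
# representation); printing it lists the primitives in the computation graph.
print(jax.make_jaxpr(tanh)(x))

# The same works for a derivative: jax.grad needs a scalar output,
# so sum the elementwise tanh before differentiating.
print(jax.make_jaxpr(jax.grad(lambda v: jnp.sum(tanh(v))))(x))
```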
Hi, when I use autograd, is it possible to see its gradient function? Or, in other words, is it possible to see the derivative of that function? Or is it possible to see the computational graph?
For example, I want to see the grad_tanh function.
Thank you
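(The grad_tanh mentioned above is presumably the example from the autograd README; for reference, a minimal sketch assuming that example:)

```python
# Minimal sketch of the grad_tanh referred to above (the autograd README example).
import autograd.numpy as np
from autograd import grad

def tanh(x):
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

# grad returns a Python function that evaluates the derivative numerically;
# it is a callable, not a symbolic formula.
grad_tanh = grad(tanh)
print(grad_tanh(1.0))        # ~0.4199743
print(1.0 - tanh(1.0) ** 2)  # same value, from the closed-form derivative of tanh
```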