I compared the Jacobian of a complex-valued function computed with numdifftools against the one computed with autograd, and found that they differ only for the parameter that enters through 1j. I would like to know what went wrong.

The problem is that when I supply the Jacobian from numdifftools (the difference is in the second parameter, p[1]), the optimization fails, but when I supply the Jacobian obtained with autograd, the optimization succeeds. I would like to know what the problem is.
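The reproduction code from the report is not preserved here. As a rough illustration of this kind of check, the sketch below compares a numdifftools Jacobian against a hand-derived one for a small complex-valued model in which the second parameter enters through 1j; the model, `t`, and `p` are made up for the example and are not the original code, and an analytic reference is used instead of autograd to keep the snippet self-contained:

```python
import numpy as np
import numdifftools as nd

t = np.linspace(0.0, 1.0, 20)

def model(p):
    # hypothetical complex-valued model; p[1] enters through 1j
    return p[0] * np.exp(1j * p[1] * t)

def jac_analytic(p):
    # closed-form Jacobian used as the reference here
    e = np.exp(1j * p[1] * t)
    return np.column_stack([e, 1j * p[0] * t * e])

p = np.array([1.0, 2.0])
jac_nd = nd.Jacobian(model)(p)

# largest absolute deviation between numerical and analytic Jacobians
print(np.max(np.abs(jac_nd - jac_analytic(p))))
```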
The problem here is that the function you are differentiating is very steep around the second parameter, so numdifftools fails there. Numdifftools is only reliable for smooth, slowly varying functions.
The only way to succeed in this case is to limit the steps taken.
If you limit the maximum step size to 0.001, like this:
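The original snippet is not preserved above; a minimal sketch of capping the step size with numdifftools' `MaxStepGenerator`, using a hypothetical stand-in for the reported model (the names `t`, `fun`, and `p` and their values are placeholders, not the original code), could look like:

```python
import numpy as np
import numdifftools as nd

# hypothetical stand-in for the reported model: p[1] enters through 1j and
# varies rapidly, so the default (large) trial steps give a poor Jacobian
t = np.linspace(0.0, 1.0, 50)

def fun(p):
    return p[0] * np.exp(1j * p[1] * t) + p[2] * t + p[3]

p = np.array([1.0, 500.0, 0.5, 0.1])

# cap the largest trial step at 0.001 so all difference quotients are taken
# where the function is locally well approximated
jac = nd.Jacobian(fun, step=nd.MaxStepGenerator(base_step=0.001))
print(jac(p))
```

Here `MaxStepGenerator` keeps numdifftools' adaptive step sequence but bounds the largest step it is allowed to try.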
you will get the expected result:
[-6.42302368e-01 1.52096259e+05 -1.07751884e-01 -2.99510421e-02]
[-6.42302368e-01 1.52096259e+05 -1.07751884e-01 -2.99510421e-02]