Minimizing the Rosenbrock function #21
adtzlr started this conversation in Show and tell
Let us consider the problem of minimizing the Rosenbrock function (taken from [1]). With tensortrax, the gradient and the Hessian are evaluated by forward-mode AD. Starting with

```python
import numpy as np
import tensortrax as tr
import tensortrax.math as tm

def rosen(x):
    "The Rosenbrock function."
    return tm.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2)

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
```

the function, its gradient and a Hessian-vector product are wrapped by tensortrax and passed to SciPy's Newton-CG optimizer:

```python
from scipy.optimize import minimize

# wrap the scalar function and its gradient (the Jacobian of a scalar-valued function)
func = tr.function(rosen)
jac = tr.jacobian(rosen)

# Hessian-vector product h(x) · v; tensortrax takes the direction as δx
hvp = tr.hessian_vector_product(rosen)
hessp = lambda x, v: hvp(x, δx=v)

res = minimize(func, x0, method="Newton-CG", jac=jac, hessp=hessp)
```
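Since the Rosenbrock function attains its global minimum at x = (1, ..., 1), the result is easy to verify. As a quick sanity check (a minimal sketch, assuming the tensortrax wrappers return plain NumPy values; the tolerances are illustrative), the AD-evaluated callables can also be compared against SciPy's analytic reference implementations rosen, rosen_der and rosen_hess_prod:

```python
from scipy.optimize import rosen as rosen_ref, rosen_der, rosen_hess_prod

# the global minimum of the Rosenbrock function is located at x = (1, ..., 1)
assert np.allclose(res.x, np.ones_like(x0), atol=1e-4)

# the AD-evaluated function, gradient and Hessian-vector product should
# match SciPy's analytic versions, e.g. at the starting point
v = np.ones_like(x0)
assert np.isclose(func(x0), rosen_ref(x0))
assert np.allclose(jac(x0), rosen_der(x0))
assert np.allclose(hessp(x0, v), rosen_hess_prod(x0, v))
```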
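Newton-CG can alternatively take the full Hessian via the hess argument. Assuming tensortrax's tr.hessian follows the same wrapper pattern as tr.jacobian, a variant without the Hessian-vector product could look like this (a sketch, not part of the original post):

```python
# sketch: pass the full (5, 5) Hessian instead of a Hessian-vector product
res = minimize(func, x0, method="Newton-CG", jac=jac, hess=tr.hessian(rosen))
```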
References

[1] SciPy documentation of scipy.optimize.minimize, https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html