You totally can. Just use jac=True:
In [1]: import numpy as np
In [2]: from scipy.optimize import minimize
In [3]: def f_and_grad(x):
   ...:     return x**2, 2*x
   ...:
In [4]: minimize(f_and_grad, [1], jac=True)
Out[4]:
fun: 1.8367099231598242e-40
hess_inv: array([[ 0.5]])
jac: array([ 2.71050543e-20])
message: 'Optimization terminated successfully.'
nfev: 4
nit: 2
njev: 4
status: 0
success: True
x: array([ 1.35525272e-20])
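
The usual reason to do it this way is that the objective and its gradient share an expensive intermediate computation, so computing them in one call avoids doing that work twice. Here is a minimal sketch of the pattern (the least-squares objective and the names A, b, loss_and_grad are mine, not part of the example above):

import numpy as np
from scipy.optimize import minimize

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

def loss_and_grad(x):
    r = A @ x - b                    # shared intermediate result
    return 0.5 * r @ r, A.T @ r      # objective 0.5*||Ax-b||^2 and its gradient

res = minimize(loss_and_grad, np.zeros(2), jac=True, method="BFGS")
print(res.x)                         # close to the solution of A @ x = b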
It's actually documented:
jac : bool or callable, optional
    Jacobian (gradient) of objective function. Only for CG, BFGS, Newton-CG,
    L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg. If jac is a Boolean and is True,
    fun is assumed to return the gradient along with the objective function.
    If False, the gradient will be estimated numerically. jac can also be a
    callable returning the gradient of the objective. In this case, it must
    accept the same arguments as fun.
(emphasis mine)
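
If you would rather keep the two computations separate, the callable form the docs mention works too — a quick sketch (the names f and grad are mine):

import numpy as np
from scipy.optimize import minimize

def f(x):
    return x[0] ** 2      # objective only

def grad(x):
    return 2 * x          # gradient, same signature as f

res = minimize(f, [1.0], jac=grad)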