integration with scipy.optimize
Add a top-level public wrapper with the signature

def least_squares(func_and_jac, x0, bounds=None, method='trf', jac=None,
                  ftol=1e-5, xtol=1e-5, gtol=1e-3, max_nfev=1000, scaling=1.0):
    # input validation etc.
    # construct the trampoline for func_and_jac if jac is not None, see below
    # manipulate the tolerances, if desired
    meths = {'trf': _lstsq_trf,
             'dogbox': _lstsq_dogbox,
             }
    # look up the solver first, call it once (calling inside the dict
    # literal would eagerly invoke every solver)
    result = meths[method](...)
    return result
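For illustration, a hypothetical call with the default jac=None, i.e. a single callable returning both the residuals and the Jacobian (the Rosenbrock residuals here are only an example, and the wrapper itself is the one sketched above):

import numpy as np

def f_and_j(x):
    # residuals of the Rosenbrock problem and their Jacobian
    r = np.array([x[0] - 1.0, 10.0 * (x[1] - x[0]**2)])
    J = np.array([[1.0, 0.0],
                  [-20.0 * x[0], 10.0]])
    return r, J

res = least_squares(f_and_j, x0=np.array([2.0, 2.0]),
                    bounds=([0.0, 0.0], [3.0, 3.0]))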
Recognized methods are 'trf', 'dogbox'. I'm +0 on adding method='LM' to wrap leastsq. An option to accept an arbitrary callable was added to optimize.minimize a while ago. I'd say keep it a future enhancement.
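If that enhancement ever lands, dispatch could mirror optimize.minimize; a minimal sketch, reusing the private routines from the snippet above (_select_solver is a hypothetical helper name):

def _select_solver(method):
    meths = {'trf': _lstsq_trf,
             'dogbox': _lstsq_dogbox,
             }
    if callable(method):
        # user-supplied solver, same convention as optimize.minimize
        return method
    return meths[method]  # raises KeyError for an unknown method name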
jac=None means that func_and_jac returns both. Otherwise, jac is a callable and func_and_jac only returns the function value.
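A minimal sketch of the trampoline mentioned in the snippet above (the name _wrap_func_and_jac is hypothetical):

def _wrap_func_and_jac(func, jac):
    # combine separate `func` and `jac` callables into the single-callable
    # form that the private solvers consume
    def func_and_jac(x, *args):
        return func(x, *args), jac(x, *args)
    return func_and_jac

Inside least_squares this would run once during input validation, so the solvers only ever deal with one calling convention.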
The order of arguments is chosen to make the default usage as light as possible:

least_squares(func, x0, bounds)
least_squares(func, x0, bounds, method='dogbox')
least_squares(func, x0, bounds, jac=my_jac_impl)  # ok, can use a kwarg
A callback argument is present in some minimizers. I'd say keep it as an optional enhancement unless we see an immediate use for it (I don't).
This seems general enough to allow future tweaks for large-scale methods.
================================
Also, add the kwarg to the method='LM' convenience wrapper curve_fit.
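Hypothetical usage once curve_fit grows the kwarg (the exponential model here is only for illustration):

import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.linspace(0.0, 4.0, 50)
ydata = model(xdata, 2.5, 1.3)

# method='trf' would route to the new bounded solver instead of leastsq
popt, pcov = curve_fit(model, xdata, ydata, p0=[1.0, 1.0], method='trf')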
================================
Strategy:
- Send a PR with the private computational methods. State clearly that the public interface is a matter for a separate PR.
- Send a separate PR adding the least_squares wrapper. Can do almost immediately: clone the branch with the private routines, add the wrapper, send a WIP PR stating that it depends on the previous one.
- Using the same method, send a third PR adding the new methods to curve_fit.