Numerical optimization/minimization

# optipy

optipy contains a generic optimization/minimization method. Its creation was motivated by the absence of an implementation of Newton's method with a custom Hessian solver in SciPy (see this bug).

The mandatory Rosenbrock example:

```python
import numpy
import optipy

a = 1.0
b = 100.0

def fun(x):
    return (a - x[0])**2 + b*(x[1] - x[0]**2)**2

def jac(x):
    return numpy.array([
        -2*(a - x[0]) - 4*b*(x[1] - x[0]**2) * x[0],
        2*b*(x[1] - x[0]**2)
    ])

def hess(x):
    return numpy.array([
        [2 + 8*b*x[0]**2 - 4*b*(x[1] - x[0]**2), -4*b*x[0]],
        [-4*b*x[0], 2*b]
    ])

def get_search_direction(x, grad):
    # Newton direction: solve H(x) p = -grad
    return numpy.linalg.solve(hess(x), -grad)

sol = optipy.minimize(
    fun=fun,
    x0=[-1.0, 3.5],
    jac=jac,
    get_search_direction=get_search_direction,
    atol=1.0e-5
)
```

This is basically the exact Newton method. When setting `get_search_direction` to `lambda x, grad: -grad`, one gets the steepest descent method. For larger computations, one will typically replace this with a tailored solver, e.g., a preconditioned Krylov solver.
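To see what the Newton setup above amounts to without installing optipy, here is a standalone sketch of the bare iteration in plain NumPy: at each step it solves the Hessian system for the search direction and takes a full step. This is only a sketch of the idea; optipy's actual loop (step control, convergence checks) may differ.

```python
import numpy

a, b = 1.0, 100.0

def jac(x):
    return numpy.array([
        -2*(a - x[0]) - 4*b*(x[1] - x[0]**2) * x[0],
        2*b*(x[1] - x[0]**2),
    ])

def hess(x):
    return numpy.array([
        [2 + 8*b*x[0]**2 - 4*b*(x[1] - x[0]**2), -4*b*x[0]],
        [-4*b*x[0], 2*b],
    ])

def get_search_direction(x, grad):
    # Newton direction: solve H(x) p = -grad
    return numpy.linalg.solve(hess(x), -grad)

# Bare exact-Newton iteration (sketch only, no step-size control)
x = numpy.array([-1.0, 3.5])
for _ in range(100):
    grad = jac(x)
    if numpy.linalg.norm(grad) < 1.0e-5:
        break
    x = x + get_search_direction(x, grad)

print(x)  # converges toward the minimum near [1.0, 1.0]
```

Replacing the `numpy.linalg.solve` call with an iterative solver is exactly the hook that `get_search_direction` exposes.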

The return type is largely compatible with SciPy's generic return type, `OptimizeResult`.
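For reference, SciPy's `OptimizeResult` is essentially a dictionary whose entries are also reachable as attributes. A minimal stand-in illustrating the pattern (the field names below are illustrative, not a specification of what optipy fills in):

```python
class OptimizeResult(dict):
    """Minimal sketch of a SciPy-style result object: a dict
    whose entries can also be read as attributes."""
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

# Illustrative fields only
sol = OptimizeResult(x=[1.0, 1.0], success=True, nit=12)
print(sol.x, sol["nit"])  # both access styles work
```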

### Installation

optipy is available from the Python Package Index, so simply do

```shell
pip install -U optipy
```

to install or upgrade. Use `sudo -H` to install as root or the `--user` option of `pip` to install in `$HOME`.

### Testing

To run the optipy unit tests, check out this repository and type

```shell
pytest
```

### Distribution

To create a new release

1. bump the `__version__` number,

2. publish to PyPI and tag on GitHub:

```shell
$ make publish
```

optipy is published under the MIT license.