One of the core libraries for nonlinear optimization is Optim.jl. Optim.jl is a lot like the standard optimizers you'd find in SciPy or MATLAB: you give it a function and it finds a minimum. For example, if you give it a univariate function, it uses Brent's method to find a minimum in an interval:
using Optim
f(x) = sin(x)+cos(x)
Optim.optimize(f,0.0,2π) # Find a minimum between 0 and 2π
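The call returns a results object rather than just the minimizing point. As a quick sketch (binding the result to a variable, here called res, which isn't in the example above), you can query it with Optim's accessor functions:
res = Optim.optimize(f,0.0,2π) # Same call as above, keeping the result
Optim.minimizer(res) # The x value that achieves the minimum
Optim.minimum(res)   # The objective value at that point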
If you give it a function that takes a vector input and returns a scalar output, it will find a local minimum over the vector space, starting from an initial guess:
f(x) = sin(x[1])+cos(x[1]+x[2])
Optim.optimize(f,zeros(2)) # Find a minimum starting at [0.0,0.0]
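By default this uses Nelder-Mead. If you want to tweak the solve, a minimal sketch (the iteration cap here is just an illustrative value) is to pass the method explicitly along with Optim.Options:
Optim.optimize(f, zeros(2), NelderMead(), Optim.Options(iterations = 10_000)) # Explicit method and options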
You can choose from Optim's large library of methods by passing in the method whose properties you want. Let's choose BFGS:
Optim.optimize(f,zeros(2),BFGS()) # Minimize with BFGS from the same starting point
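Gradient-based methods like BFGS can use automatic differentiation instead of finite differencing for the gradient. A minimal sketch, assuming forward-mode AD is available through Optim's autodiff keyword:
Optim.optimize(f, zeros(2), BFGS(); autodiff = :forward) # Forward-mode AD for the gradient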
Global optimization is provided by BlackBoxOptim.jl, a native Julia implementation. You have to give it box constraints as a search range and tell it the size of the input vector:
using BlackBoxOptim
function rosenbrock2d(x) # The standard 2D Rosenbrock test function
    return (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
end
res = bboptimize(rosenbrock2d; SearchRange = (-5.0, 5.0), NumDimensions = 2)
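The result object holds the best candidate found. A short sketch using BlackBoxOptim's accessor functions, plus a per-dimension search range for when the bounds differ across dimensions:
best_candidate(res) # The best input vector found
best_fitness(res)   # The objective value at that candidate
# Per-dimension bounds can be given as a vector of tuples (NumDimensions is then implied):
bboptimize(rosenbrock2d; SearchRange = [(-5.0, 5.0), (-5.0, 5.0)])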
JuMP.jl is a large library for all sorts of optimization problems. It has solvers for linear, quadratic, and other classes of programming problems, so if you're not doing nonlinear optimization, JuMP is a great choice. If you're looking to do convex programming, Convex.jl is a library with methods specific to that purpose. If you want to do nonlinear optimization with constraints, NLopt.jl is a library with a large set of choices, including a bunch of derivative-free local optimization methods. Its only issue is that it's an interface to a C library and can be more difficult to debug than the native Julia codes, but otherwise it's a great alternative to Optim and BlackBoxOptim.
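For a flavor of JuMP's modeling style, here is a minimal linear programming sketch. It assumes the HiGHS.jl solver is installed; any LP-capable solver that JuMP supports would work the same way:
using JuMP, HiGHS
model = Model(HiGHS.Optimizer)
@variable(model, x >= 0) # Decision variables with lower bounds
@variable(model, y >= 0)
@objective(model, Min, 2x + 3y) # Linear objective
@constraint(model, x + y >= 1)  # Linear constraint
optimize!(model)
value(x), value(y) # The optimal point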
Use Optim.jl to optimize Hosaki's Function. Use the initial condition [2.0,2.0].
Use BlackBoxOptim.jl to find the global minimum of the Adjiman Function with $-1 < x_1 < 2$ and $-1 < x_2 < 1$.