Merge pull request #788 from SciML/newdocs
New tutorials and docs
Showing 8 changed files with 269 additions and 115 deletions.
# Optimization.jl

Some solvers are available directly from the Optimization.jl package, without needing to install any of the solver wrapper packages.

## Methods

`LBFGS`: the popular quasi-Newton method that uses a limited-memory BFGS approximation of the inverse Hessian. It is provided through a wrapper over the [L-BFGS-B](https://users.iems.northwestern.edu/%7Enocedal/lbfgsb.html) Fortran routine, accessed via the [LBFGSB.jl](https://github.com/Gnimuc/LBFGSB.jl/) package, and it directly supports box constraints.

It can also handle arbitrary nonlinear constraints through an Augmented Lagrangian method with bound constraints, as described in Section 17.4 of Numerical Optimization by Nocedal and Wright, making it a general-purpose nonlinear optimization solver available directly in Optimization.jl.

## Examples

### Unconstrained Rosenbrock problem

```@example L-BFGS
using Optimization, Zygote
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]
optf = OptimizationFunction(rosenbrock, AutoZygote())
prob = Optimization.OptimizationProblem(optf, x0, p)
sol = solve(prob, Optimization.LBFGS())
```
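
### With box constraints

Since `LBFGS` supports box constraints directly, bounds can also be supplied on their own. The following is a minimal sketch reusing `optf`, `x0`, and `p` from the example above; the specific bound values are illustrative, not taken from the original docs.

```@example L-BFGS
# Bounds-only problem: `lb` and `ub` are elementwise lower and upper bounds
# on the variables (illustrative values; `optf`, `x0`, `p` defined above).
prob_box = OptimizationProblem(optf, x0, p; lb = [-1.0, -1.0], ub = [1.5, 1.5])
sol_box = solve(prob_box, Optimization.LBFGS())
```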

### With nonlinear and bounds constraints

```@example L-BFGS
function con2_c(res, x, p)
    # Two constraints: x[1]^2 + x[2]^2 and x[2]*sin(x[1]) + x[1] - 5
    res .= [x[1]^2 + x[2]^2, (x[2] * sin(x[1]) + x[1]) - 5]
end
optf = OptimizationFunction(rosenbrock, AutoZygote(), cons = con2_c)
# The first constraint is fixed to 1.0 (lcons == ucons), the second must be ≤ 0.0,
# and box bounds of [-1, 1] are placed on both variables.
prob = OptimizationProblem(optf, x0, p, lcons = [1.0, -Inf],
    ucons = [1.0, 0.0], lb = [-1.0, -1.0],
    ub = [1.0, 1.0])
res = solve(prob, Optimization.LBFGS(), maxiters = 100)
```
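
The returned solution can be inspected as usual; `u`, `objective`, and `retcode` are standard fields of an Optimization.jl solution object (this quick check is an addition, not part of the original example).

```@example L-BFGS
# Optimizer, objective value, and return code of the constrained solve.
res.u, res.objective, res.retcode
```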
# Using SymbolicAnalysis.jl for convexity certificates

In this tutorial, we will show how to obtain automatic convexity certification of an optimization problem using [SymbolicAnalysis.jl](https://github.com/Vaibhavdixit02/SymbolicAnalysis.jl).

This is enabled with the `structural_analysis` keyword argument to `OptimizationProblem`, which tells the package to trace through the objective and constraints with symbolic variables (for more details, see the [Symbolics documentation](https://symbolics.juliasymbolics.org/stable/manual/functions/#function_registration)). The analysis relies on the Disciplined Programming approach and hence necessitates the use of "atoms" from the SymbolicAnalysis.jl package.

We'll use a simple example to illustrate the convexity certification process.

```@example symanalysis
using SymbolicAnalysis, Zygote, LinearAlgebra, Optimization, OptimizationMOI

function f(x, p = nothing)
    return exp(x[1]) + x[1]^2
end

optf = OptimizationFunction(f, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [0.4], structural_analysis = true)
sol = solve(prob, Optimization.LBFGS(), maxiters = 1000)
```

The result can be accessed as the `analysis_results` field of the solution's cache.

```@example symanalysis
sol.cache.analysis_results.objective
```

Relatedly, you can enable structural analysis in Riemannian optimization problems (currently supported only on the symmetric positive definite (SPD) manifold).

We'll look at the Riemannian center of mass of SPD matrices, which is known to be a geodesically convex problem on the SPD manifold.

```@example symanalysis
using Optimization, OptimizationManopt, Symbolics, Manifolds, Random, LinearAlgebra,
      SymbolicAnalysis

M = SymmetricPositiveDefinite(5)
m = 100
σ = 0.005
q = Matrix{Float64}(LinearAlgebra.I(5)) .+ 2.0
# Sample SPD matrices in a small neighborhood of `q` on the manifold.
data2 = [exp(M, q, σ * rand(M; vector_at = q)) for i in 1:m];

# Sum of squared Riemannian distances to the first 5 sample points.
f(x, p = nothing) = sum(SymbolicAnalysis.distance(M, data2[i], x)^2 for i in 1:5)
optf = OptimizationFunction(f, Optimization.AutoZygote())
prob = OptimizationProblem(optf, data2[1]; manifold = M, structural_analysis = true)

opt = OptimizationManopt.GradientDescentOptimizer()
sol = solve(prob, opt, maxiters = 100)
```