
Commit

rem multistartopt from docs project fro now
Vaibhavdixit02 committed Sep 19, 2024
1 parent f29d5d8 commit bc2eb0a
Showing 4 changed files with 6 additions and 9 deletions.
2 changes: 0 additions & 2 deletions docs/Project.toml

@@ -24,7 +24,6 @@ OptimizationGCMAES = "6f0a0517-dbc2-4a7a-8a20-99ae7f27e911"
 OptimizationMOI = "fd9f6733-72f4-499f-8506-86b2bdd0dea1"
 OptimizationManopt = "e57b7fff-7ee7-4550-b4f0-90e9476e9fb6"
 OptimizationMetaheuristics = "3aafef2f-86ae-4776-b337-85a36adf0b55"
-OptimizationMultistartOptimization = "e4316d97-8bbb-4fd3-a7d8-3851d2a72823"
 OptimizationNLPModels = "064b21be-54cf-11ef-1646-cdfee32b588f"
 OptimizationNLopt = "4e6fcdb7-1186-4e1f-a706-475e75c168bb"
 OptimizationNOMAD = "2cab0595-8222-4775-b714-9828e6a9e01b"
@@ -67,7 +66,6 @@ OptimizationGCMAES = "0.3"
 OptimizationMOI = "0.5"
 OptimizationManopt = "0.0.4"
 OptimizationMetaheuristics = "0.3"
-OptimizationMultistartOptimization = "0.3"
 OptimizationNLPModels = "0.0.2"
 OptimizationNLopt = "0.3"
 OptimizationNOMAD = "0.3"
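
Should the docs build need this dependency again, the removed entry can be restored with standard Pkg tooling rather than by hand-editing the TOML (a sketch using only stock Pkg commands, not part of this commit):

```julia
using Pkg

# Activate the docs environment and re-add the package;
# Pkg fills in the UUID and a compatible version automatically.
Pkg.activate("docs")
Pkg.add("OptimizationMultistartOptimization")
```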
4 changes: 2 additions & 2 deletions docs/src/optimization_packages/multistartoptimization.md

@@ -31,7 +31,7 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 
 The Rosenbrock function can be optimized using `MultistartOptimization.TikTak()` with 100 initial points and the local method `NLopt.LD_LBFGS()` as follows:
 
-```@example MultiStart
+```julia
 using Optimization, OptimizationMultistartOptimization, OptimizationNLopt
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
@@ -43,7 +43,7 @@ sol = solve(prob, MultistartOptimization.TikTak(100), NLopt.LD_LBFGS())
 ```
 
 You can use any `Optimization` optimizers you like. The global method of `MultistartOptimization` is a positional argument, followed by the local method. For example, we can perform a multistart optimization with LBFGS as the local optimizer using either the `NLopt.jl` or `Optim.jl` implementation. Moreover, this interface lets you access and adjust all the optimizer settings as you normally would:
 
-```@example MultiStart
+```julia
 using OptimizationOptimJL
 f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
 prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
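
The hunk above is truncated before the Optim.jl example's final call. Based on the surrounding text, the completed variant would look roughly like this (a sketch: the concluding `solve` line is an assumption, since the diff cuts off before it):

```julia
using Optimization, OptimizationMultistartOptimization, OptimizationOptimJL

# Same Rosenbrock setup as in the NLopt example above.
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])

# TikTak is the global phase; Optim's LBFGS serves as the local refiner.
sol = solve(prob, MultistartOptimization.TikTak(100), LBFGS())
```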
2 changes: 1 addition & 1 deletion docs/src/optimization_packages/optimization.md

@@ -89,4 +89,4 @@ optf = OptimizationFunction(loss, AutoZygote())
 prob = OptimizationProblem(optf, ps_ca, data)
 res = Optimization.solve(prob, Optimization.Sophia(), callback = callback)
-```
+```
7 changes: 3 additions & 4 deletions docs/src/tutorials/minibatch.md

@@ -67,13 +67,12 @@ k = 10
 train_loader = MLUtils.DataLoader((ode_data, t), batchsize = k)
 numEpochs = 300
-l1 = loss_adjoint(pp, train_loader.data[1], train_loader.data[2])[1]
+l1 = loss_adjoint(pp, train_loader.data)[1]
 optfun = OptimizationFunction(
     loss_adjoint,
     Optimization.AutoZygote())
-optprob = OptimizationProblem(optfun, pp)
+optprob = OptimizationProblem(optfun, ps_ca, train_loader)
 using IterTools: ncycle
-res1 = Optimization.solve(optprob, Optimisers.ADAM(0.05), ncycle(train_loader, numEpochs),
-    callback = callback)
+res1 = Optimization.solve(optprob, Optimisers.ADAM(0.05); callback = callback, epochs = 1000)
 ```
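
The minibatch hunk switches from cycling the loader manually with `IterTools.ncycle` to passing the `DataLoader` into the `OptimizationProblem` and an `epochs` keyword to `solve`. The new pattern can be sketched as follows (a sketch assuming `loss_adjoint`, `ps_ca`, `ode_data`, `t`, and `callback` are defined earlier in the tutorial, as in the surrounding docs page):

```julia
using Optimization, OptimizationOptimisers, MLUtils

k = 10
train_loader = MLUtils.DataLoader((ode_data, t), batchsize = k)

optfun = OptimizationFunction(loss_adjoint, Optimization.AutoZygote())

# The data loader now travels with the problem itself...
optprob = OptimizationProblem(optfun, ps_ca, train_loader)

# ...and the epoch count is a keyword to solve, replacing the
# old ncycle(train_loader, numEpochs) iterator argument.
res1 = Optimization.solve(optprob, Optimisers.ADAM(0.05);
    callback = callback, epochs = 1000)
```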
