Commit a8dbada

docs: use BackTracking with LBFGS for missing physics ude showcase
sathvikbhagavan committed Jan 30, 2024
1 parent ec8f4d7 commit a8dbada
Showing 1 changed file with 5 additions and 4 deletions.
docs/src/showcase/missing_physics.md: 9 changes (5 additions & 4 deletions)
@@ -24,7 +24,8 @@ are and how they are used. For the neural network training:
| [SciMLSensitivity.jl](https://docs.sciml.ai/SciMLSensitivity/stable/) | The adjoint methods, defines gradients of ODE solvers |
| [Optimization.jl](https://docs.sciml.ai/Optimization/stable/) | The optimization library |
| [OptimizationOptimisers.jl](https://docs.sciml.ai/Optimization/stable/optimization_packages/optimisers/) | The optimization solver package with `Adam` |
-| [OptimizationOptimJL.jl](https://docs.sciml.ai/Optimization/stable/optimization_packages/optim/) | The optimization solver package with `BFGS` |
+| [OptimizationOptimJL.jl](https://docs.sciml.ai/Optimization/stable/optimization_packages/optim/) | The optimization solver package with `LBFGS` |
+| [LineSearches.jl](https://julianlsolvers.github.io/LineSearches.jl/latest/index.html) | Line search algorithms package to be used with `LBFGS` (see the sketch after this table) |
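For context, a minimal, self-contained sketch of how a LineSearches.jl algorithm plugs into `Optim.LBFGS` through Optimization.jl. The Rosenbrock objective, the starting point, and the use of `AutoForwardDiff` here are illustrative assumptions, not part of the showcase itself:

```julia
using Optimization, OptimizationOptimJL, ForwardDiff, LineSearches

# Standard Rosenbrock test function (an illustrative objective only).
rosenbrock(u, p) = (1.0 - u[1])^2 + 100.0 * (u[2] - u[1]^2)^2

# LBFGS needs gradients; AutoForwardDiff() provides them via ForwardDiff.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [0.0, 0.0])

# BackTracking() from LineSearches.jl replaces LBFGS's default line search.
sol = solve(prob, Optim.LBFGS(linesearch = BackTracking()))
```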

For the symbolic model discovery:

@@ -61,7 +62,7 @@ And external libraries:
```@example ude
# SciML Tools
using OrdinaryDiffEq, ModelingToolkit, DataDrivenDiffEq, SciMLSensitivity, DataDrivenSparse
-using Optimization, OptimizationOptimisers, OptimizationOptimJL
+using Optimization, OptimizationOptimisers, OptimizationOptimJL, LineSearches
# Standard Libraries
using LinearAlgebra, Statistics
@@ -266,10 +267,10 @@ second optimization, and run it with BFGS. This looks like:

```@example ude
optprob2 = Optimization.OptimizationProblem(optf, res1.u)
-res2 = Optimization.solve(optprob2, Optim.LBFGS(), callback = callback, maxiters = 1000)
+res2 = Optimization.solve(optprob2, Optim.LBFGS(linesearch = BackTracking()), callback = callback, maxiters = 1000)
println("Final training loss after $(length(losses)) iterations: $(losses[end])")
-# Rename the best candidate
+# Rename the best candidate
p_trained = res2.u
```
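
Worth noting for readers of this change: Optim's `LBFGS` defaults to the `HagerZhang` line search, which enforces (approximate) Wolfe conditions and typically spends several objective and gradient evaluations per step. `BackTracking` checks only the Armijo sufficient-decrease condition, so it is usually cheaper per iteration, which can matter when every loss evaluation requires solving the ODE. Other LineSearches.jl algorithms drop in the same way; a hypothetical comparison run, reusing `optprob2` and `callback` from the showcase (not part of this commit):

```julia
# Hypothetical comparison with the default HagerZhang line search;
# `optprob2` and `callback` are defined earlier in the showcase.
res2_hz = Optimization.solve(optprob2, Optim.LBFGS(linesearch = HagerZhang()),
    callback = callback, maxiters = 1000)
```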

