From a8dbadaf8adc7e4be110869816af53e94f9af7bc Mon Sep 17 00:00:00 2001
From: Sathvik Bhagavan
Date: Tue, 30 Jan 2024 06:38:43 +0000
Subject: [PATCH] docs: use BackTracking with LBFGS for missing physics ude
 showcase

---
 docs/src/showcase/missing_physics.md | 9 +++++----
 1 file changed, 5 insertions(+), 4 deletions(-)

diff --git a/docs/src/showcase/missing_physics.md b/docs/src/showcase/missing_physics.md
index 0262e6046fd..a00f32ff085 100644
--- a/docs/src/showcase/missing_physics.md
+++ b/docs/src/showcase/missing_physics.md
@@ -24,7 +24,8 @@ are and how they are used. For the neural network training:
 | [SciMLSensitivity.jl](https://docs.sciml.ai/SciMLSensitivity/stable/) | The adjoint methods, defines gradients of ODE solvers |
 | [Optimization.jl](https://docs.sciml.ai/Optimization/stable/) | The optimization library |
 | [OptimizationOptimisers.jl](https://docs.sciml.ai/Optimization/stable/optimization_packages/optimisers/) | The optimization solver package with `Adam` |
-| [OptimizationOptimJL.jl](https://docs.sciml.ai/Optimization/stable/optimization_packages/optim/) | The optimization solver package with `BFGS` |
+| [OptimizationOptimJL.jl](https://docs.sciml.ai/Optimization/stable/optimization_packages/optim/) | The optimization solver package with `LBFGS` |
+| [LineSearches.jl](https://julianlsolvers.github.io/LineSearches.jl/latest/index.html) | Line search algorithms package to be used with `LBFGS` |
 
 For the symbolic model discovery:
 
@@ -61,7 +62,7 @@ And external libraries:
 ```@example ude
 # SciML Tools
 using OrdinaryDiffEq, ModelingToolkit, DataDrivenDiffEq, SciMLSensitivity, DataDrivenSparse
-using Optimization, OptimizationOptimisers, OptimizationOptimJL
+using Optimization, OptimizationOptimisers, OptimizationOptimJL, LineSearches
 
 # Standard Libraries
 using LinearAlgebra, Statistics
@@ -266,10 +267,10 @@ second optimization, and run it with BFGS. This looks like:
 
 ```@example ude
 optprob2 = Optimization.OptimizationProblem(optf, res1.u)
-res2 = Optimization.solve(optprob2, Optim.LBFGS(), callback = callback, maxiters = 1000)
+res2 = Optimization.solve(optprob2, Optim.LBFGS(linesearch = BackTracking()), callback = callback, maxiters = 1000)
 println("Final training loss after $(length(losses)) iterations: $(losses[end])")
 
-# Rename the best candidate
+# Rename the best candidate
 p_trained = res2.u
 ```
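Reviewer note: for anyone who wants to try the pattern this patch adopts without running the full UDE showcase, below is a minimal standalone sketch of `Optim.LBFGS` driven by a `BackTracking()` line search from LineSearches.jl, through the same Optimization.jl `solve` call signature the patched docs use. The toy quadratic objective, starting point, parameter values, and `AutoZygote` choice are illustrative assumptions, not part of the patch.

```julia
# Minimal sketch (assumptions noted above): LBFGS + BackTracking line search
# on a toy quadratic standing in for the showcase's UDE loss.
using Optimization, OptimizationOptimJL, LineSearches, Zygote

loss(u, p) = sum(abs2, u .- p)  # toy objective; minimum at u == p

optf = Optimization.OptimizationFunction(loss, Optimization.AutoZygote())
optprob = Optimization.OptimizationProblem(optf, zeros(2), [1.0, 2.0])

# BackTracking() comes from LineSearches.jl and is passed through the
# `linesearch` keyword of Optim.LBFGS, as in the patched code block.
res = Optimization.solve(optprob, Optim.LBFGS(linesearch = BackTracking()),
                         maxiters = 1000)
println(res.u)  # should approach [1.0, 2.0]
```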