Merge pull request #221 from sathvikbhagavan/sb/adam
docs: use `Adam` instead of `ADAM` in missing_physics tutorial
ChrisRackauckas authored Mar 1, 2024
2 parents 735a9e6 + 0e61db3 commit 8b2267b
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions docs/src/showcase/missing_physics.md
@@ -258,7 +258,7 @@ Thus we first solve the optimization problem with ADAM. Choosing a learning rate
(tuned to be as high as possible that doesn't tend to make the loss shoot up), we see:

```@example ude
-res1 = Optimization.solve(optprob, ADAM(), callback = callback, maxiters = 5000)
+res1 = Optimization.solve(optprob, OptimizationOptimisers.Adam(), callback = callback, maxiters = 5000)
println("Training loss after $(length(losses)) iterations: $(losses[end])")
```

@@ -267,7 +267,7 @@ second optimization, and run it with BFGS. This looks like:

```@example ude
optprob2 = Optimization.OptimizationProblem(optf, res1.u)
-res2 = Optimization.solve(optprob2, Optim.LBFGS(linesearch = BackTracking()), callback = callback, maxiters = 1000)
+res2 = Optimization.solve(optprob2, LBFGS(linesearch = BackTracking()), callback = callback, maxiters = 1000)
println("Final training loss after $(length(losses)) iterations: $(losses[end])")
# Rename the best candidate
```
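
For context, the two edited calls are the two stages of the tutorial's training loop: Adam is robust far from an optimum and makes fast early progress, while LBFGS converges quickly once near one. Below is a minimal sketch of that loop as it reads after this commit; the `using` lines and the surrounding setup (`optprob`, `optf`, `losses`, `callback`) are assumptions about the tutorial's preamble, not part of the diff.

```julia
# Minimal sketch of the two-stage training loop after this commit.
# ASSUMPTION: `optprob`, `optf`, `losses`, and `callback` are defined earlier
# in missing_physics.md; only the two solve calls are taken from the diff.
using Optimization, OptimizationOptimisers, OptimizationOptimJL
using LineSearches: BackTracking

# Stage 1: Adam makes fast initial progress on the raw loss landscape.
res1 = Optimization.solve(optprob, OptimizationOptimisers.Adam(),
                          callback = callback, maxiters = 5000)
println("Training loss after $(length(losses)) iterations: $(losses[end])")

# Stage 2: restart from the Adam minimizer and polish with LBFGS plus a
# backtracking line search, which converges quickly near a minimum.
optprob2 = Optimization.OptimizationProblem(optf, res1.u)
res2 = Optimization.solve(optprob2, LBFGS(linesearch = BackTracking()),
                          callback = callback, maxiters = 1000)
println("Final training loss after $(length(losses)) iterations: $(losses[end])")
```

The rename itself tracks Optimisers.jl, which OptimizationOptimisers wraps and which uses the mixed-case `Adam` rather than the older all-caps `ADAM`.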
