Update #711: checking if OptimizationFunction is used for derivative-based optimizers #715
Conversation
Adding traits for optimizers that require derivatives, and throwing an error telling the user to use an OptimizationFunction.
Update OptimizationOptimJL.jl
This should be set for all the packages in lib/ (i.e. the solver wrappers)
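For illustration, a minimal sketch of the pattern each wrapper under lib/ would repeat for its own solver types, assuming SciMLBase ≥ 2.30 where these trait functions are defined (MySolver is a stand-in, not a real optimizer):

```julia
using SciMLBase

# Stand-in for a wrapped solver type; each lib/ package would use its own types.
struct MySolver end

# Opt in to whichever derivative requirements the solver actually has.
SciMLBase.requiresgradient(opt::MySolver) = true   # needs the objective gradient
SciMLBase.requireshessian(opt::MySolver) = false   # no Hessian needed
SciMLBase.requiresconsjac(opt::MySolver) = false   # no constraint Jacobian needed
SciMLBase.requiresconshess(opt::MySolver) = false  # no constraint Hessian needed
```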
Updated the trait requiresderivative to requiresgradient.
Added requiresgradient, requireshessian, requiresconshess and requiresconsjac traits.
Added new traits: requiresgradient, requireshessian, requiresconsjac and requiresconshess.
Added the traits requiresgradient, requireshessian, requiresconsjac and requiresconshess.
Added traits requireshessian, requiresgradient, requiresconshess and requiresconsjac.
Added traits requireshessian, requiresgradient, requiresconsjac and requiresconshess.
@@ -14,6 +14,7 @@ SciMLBase.requiresbounds(opt::Optim.SAMIN) = true
SciMLBase.supports_opt_cache_interface(opt::Optim.AbstractOptimizer) = true
SciMLBase.supports_opt_cache_interface(opt::Union{Optim.Fminbox, Optim.SAMIN}) = true
SciMLBase.supports_opt_cache_interface(opt::Optim.ConstrainedOptimizer) = true
SciMLBase.requiresgradient(Optim.AbstractOptimizer) = !(opt isa Optim.ZerothOrderOptimizer)
Looks good, but this wrapper will need more of the traits set to true (Newton, NewtonTrustRegion and IPNewton)
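A hedged sketch of what that could look like for the solvers named above (the exact set of traits each one needs is an assumption here, not the final diff):

```julia
using Optim, SciMLBase

# Newton-type methods are second order, so they need the objective Hessian.
SciMLBase.requireshessian(opt::Union{Optim.Newton, Optim.NewtonTrustRegion}) = true

# IPNewton is an interior-point method working with constraints, so it plausibly
# also needs the constraint Jacobian and Hessian (assumption for illustration).
SciMLBase.requireshessian(opt::Optim.IPNewton) = true
SciMLBase.requiresconsjac(opt::Optim.IPNewton) = true
SciMLBase.requiresconshess(opt::Optim.IPNewton) = true
```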
And not all of them follow the abstract type; in particular, check the SAMIN and Fminbox algorithms.
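For example, a hedged sketch of handling those two separately (the SAMIN method is redundant since false is already the default, and is shown only to make the point explicit):

```julia
using Optim, SciMLBase

# Fminbox wraps an inner, typically gradient-based, optimizer, so the gradient
# is still needed for the barrier objective (see the later commit note).
SciMLBase.requiresgradient(opt::Optim.Fminbox) = true

# SAMIN is simulated annealing and derivative-free; the default is already false,
# so this definition is redundant and only illustrative.
SciMLBase.requiresgradient(opt::Optim.SAMIN) = false
```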
@@ -9,6 +9,11 @@ abstract type BBO end
SciMLBase.requiresbounds(::BBO) = true
SciMLBase.allowsbounds(::BBO) = true
SciMLBase.supports_opt_cache_interface(opt::BBO) = true
SciMLBase.requiresgradient(::BBO) = false
Since the default value is false, you only need to declare the ones you want to set to true here; so, for example, you would not need to define anything here in OptimizationBBO since they are all false.
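In other words, assuming SciMLBase ships generic fallbacks along the lines of `requiresgradient(opt) = false`, a derivative-free wrapper like OptimizationBBO needs no trait methods at all; a small sketch:

```julia
using SciMLBase

# Stand-in for a derivative-free optimizer type such as the BBO wrappers.
struct DummyDerivativeFree end

# No trait methods are defined for it, so the generic fallbacks (assumed to
# return false, as stated above) already give the right answer.
SciMLBase.requiresgradient(DummyDerivativeFree())  # expected: false
```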
Added traits for IPNewton, Newton, and NewtonTrustRegion.
Added requiresgradient trait to NLOPT.jl
Set trait requiresgradient to true.
@@ -8,6 +8,7 @@ using Optimization.SciMLBase
SciMLBase.allowsbounds(opt::Union{NLopt.Algorithm, NLopt.Opt}) = true
SciMLBase.supports_opt_cache_interface(opt::Union{NLopt.Algorithm, NLopt.Opt}) = true
SciMLBase.requiresgradient(opt::Union{NLopt.Algorithm, NLopt.Opt}) = true
It would be useful to check whether NLopt has a type hierarchy we can use here to do this better; since it has all kinds of algorithms, anything we put here will be wrong for some of them, in which case I would want to set the Hessian trait to true as well.
How about I set the traits requiresgradient and requireshessian to true only for the gradient-based algorithms (local and global)? Only the derivative-free local optimizers would then have these traits set to false.
That's what I am hoping we can do, yeah. But you should also be able to set requireshessian to true only for the second-order methods by looking at the NLopt docs. For distinguishing derivative-free methods from the others, do you plan to use the prefix in the method names?
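For reference, NLopt algorithm names follow a prefix convention: the first letter is L(ocal) or G(lobal) and the second is D (derivative-based) or N (derivative-free). A hedged sketch of a name-based classifier (the helper is hypothetical and not part of this PR; wiring it into trait methods for NLopt's concrete types is what is being discussed):

```julia
# Hypothetical helper: classify an NLopt algorithm by its name prefix,
# e.g. :LD_LBFGS is local and derivative-based, :LN_NELDERMEAD is local
# and derivative-free, :GD_STOGO is global and derivative-based.
uses_gradient(name::Symbol) = startswith(String(name), "LD") || startswith(String(name), "GD")

uses_gradient(:LD_LBFGS)       # true
uses_gradient(:LN_NELDERMEAD)  # false
```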
@@ -128,7 +136,8 @@ function SciMLBase.__solve(cache::OptimizationCache{
local x, cur, state

cur, state = iterate(cache.data)

(requiresgradient(Optim.AbstractOptimizer))
This should be done in the solve defined in SciMLBase; I'll point you to it later once this PR is done. You can remove this error from here.
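For context, a hedged sketch of what such a check inside the SciMLBase solve entry point might look like; the function and field names below (check_derivative_requirements, f.adtype, f.grad, NoAD) are assumptions for illustration, not the actual SciMLBase code:

```julia
using SciMLBase

# Hypothetical pre-solve check: a derivative-requiring optimizer must be given
# an OptimizationFunction that can supply gradients, either via an AD backend
# or via a user-provided grad function.
function check_derivative_requirements(f, opt)
    if SciMLBase.requiresgradient(opt) && f.adtype isa SciMLBase.NoAD && f.grad === nothing
        throw(ArgumentError("This optimizer requires gradients; use an OptimizationFunction " *
                            "with an AD backend or a user-supplied grad."))
    end
    return nothing
end
```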
Set traits requiresgradient and requiresconsjac to true for certain algorithms.
Set trait requiresgradient to true.
Set trait requiresgradient to true.
Removed redundant traits declared.
Set traits requiresgradient, requireshessian and requiresconsjac to true based on the algorithm.
Updated traits for Newton and NewtonTrustRegion: as second-order methods, they automatically require the calculation of Hessians.
Updated traits.
Set the requiresgradient for Fminbox to true as it is required for the objective function evaluation and constraint handling.
Set traits requiresgradient, requireshessian, requiresconshess and requiresconsjac to true.
Removed redundant check for requiresgradient trait.
@ParasPuneetSingh there are a few errors here, please fix them; from the CI run it seems like syntax issues.
Corrected syntax errors.
Fixed syntax errors.
I checked the CI run and corrected the syntax errors in OptimizationNLopt.jl, but now it says that "Optim.AbstractOptimizer" is not a valid function argument name. It has been used for other traits as well, so I don't get this syntax error.
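For reference, the error points at the method signature itself: when the body refers to `opt`, the argument has to be written as a named, type-annotated argument rather than a bare type. A corrected sketch of the line in question:

```julia
using Optim, SciMLBase

# Invalid: a bare type is not a valid argument name, and `opt` is undefined in the body.
# SciMLBase.requiresgradient(Optim.AbstractOptimizer) = !(opt isa Optim.ZerothOrderOptimizer)

# Valid: name the argument and annotate its type.
SciMLBase.requiresgradient(opt::Optim.AbstractOptimizer) = !(opt isa Optim.ZerothOrderOptimizer)
```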
@@ -4,6 +4,8 @@ using Reexport
@reexport using Optimization
using Optimization.SciMLBase, OptimizationOptimJL, OptimizationOptimisers

SciMLBase.requiresgradient(opt::PolyOpt) = true

struct PolyOpt end
PolyOpt is defined after the trait call; switch the order around.
Interchanged the struct definition and trait checking.
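After the swap, the snippet would read roughly as follows (a sketch of the corrected order; the struct must exist before a method signature can dispatch on it):

```julia
using Optimization.SciMLBase

# Define the type first, then the trait method that dispatches on it.
struct PolyOpt end

SciMLBase.requiresgradient(opt::PolyOpt) = true
```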
Change the compat bound on SciMLBase to 2.30.0, then this should be ready to merge.
Updated compat for SciMLBase
@@ -64,7 +64,7 @@ OptimizationPolyalgorithms = "0.1, 0.2"
OptimizationSpeedMapping = "0.1, 0.2"
OrdinaryDiffEq = "6"
ReverseDiff = ">= 1.9.0"
SciMLBase = "2"
Not just in the docs; also update the Project.toml in the base directory of the repo.
Updated SciMLBase compat to 2.30.0
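The resulting compat entry would look something like this (a sketch, assuming standard caret semantics so the bound means at least 2.30.0 and below 3):

```toml
[compat]
SciMLBase = "2.30.0"
```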
Added checking if OptimizationFunction is used for derivative-based optimizers #711