diff --git a/docs/src/interfaces/Init_Solve.md b/docs/src/interfaces/Init_Solve.md index 840749c7f..66cd2b524 100644 --- a/docs/src/interfaces/Init_Solve.md +++ b/docs/src/interfaces/Init_Solve.md @@ -37,7 +37,7 @@ is distinctly different from the [LinearSolve init interface](https://docs.sciml.ai/LinearSolve/stable/tutorials/caching_interface) which is designed for caching efficiency with reusing factorizations. -## __solve and High-Level Handling +## `__solve` and High-Level Handling While `init` and `solve` are the common entry point for users, solver packages will mostly define dispatches on `SciMLBase.__init` and `SciMLBase.__solve`. The reason is diff --git a/docs/src/interfaces/Problems.md b/docs/src/interfaces/Problems.md index 3423d5a2a..db253e41a 100644 --- a/docs/src/interfaces/Problems.md +++ b/docs/src/interfaces/Problems.md @@ -86,7 +86,7 @@ usage, a `AbstractSciMLProblem` might be associated with some solver configurati callback or tolerance. Thus, for flexibility the extra keyword arguments to the `AbstractSciMLProblem` are carried to the solver. -### problem_type +### `problem_type` `AbstractSciMLProblem` types include a non-public API definition of `problem_type` which holds a trait type corresponding to the way the `AbstractSciMLProblem` was constructed. For example, diff --git a/docs/src/interfaces/SciMLFunctions.md b/docs/src/interfaces/SciMLFunctions.md index 9f062bff7..8cad66f73 100644 --- a/docs/src/interfaces/SciMLFunctions.md +++ b/docs/src/interfaces/SciMLFunctions.md @@ -94,8 +94,8 @@ on setting up time/parameter dependent operators. The solver libraries internally use packages such as [FiniteDiff.jl](https://docs.sciml.ai/FiniteDiff/stable/) and [SparseDiffTools.jl](https://docs.sciml.ai/SparseDiffTools/stable/) for high performance calculation of sparse Jacobians and Hessians, along with matrix-free -calculations of Jacobian-Vector products (J*v), vector-Jacobian products (v'*J), -and Hessian-vector products (H*v). 
The SciML interface gives users the ability +calculations of Jacobian-Vector products (`J*v`), vector-Jacobian products (`v'*J`), +and Hessian-vector products (`H*v`). The SciML interface gives users the ability to control these connections in order to allow for top notch performance. -The key arguments in the SciMLFunction is the `prototype`, which is an object +The key argument in the SciMLFunction is the `prototype`, which is an object diff --git a/src/SciMLBase.jl b/src/SciMLBase.jl index 595d0be35..9a021fceb 100644 --- a/src/SciMLBase.jl +++ b/src/SciMLBase.jl @@ -100,7 +100,7 @@ abstract type AbstractOptimizationCache end """ $(TYPEDEF) -Base for types which define nonlinear solve problems (f(u)=0). +Base for types which define nonlinear solve problems (`f(u)=0`). """ abstract type AbstractNonlinearProblem{uType, isinplace} <: AbstractDEProblem end abstract type AbstractIntervalNonlinearProblem{uType, isinplace} <: diff --git a/src/alg_traits.jl b/src/alg_traits.jl index 9f1b94246..0b86ad67a 100644 --- a/src/alg_traits.jl +++ b/src/alg_traits.jl @@ -1,5 +1,5 @@ """ -isautodifferentiable(alg::AbstractDEAlgorithm) + isautodifferentiable(alg::AbstractDEAlgorithm) Trait declaration for whether an algorithm is compatible with direct automatic differentiation, i.e. can have algorithms like @@ -11,7 +11,7 @@ Defaults to false as only pure-Julia algorithms can have this be true. isautodifferentiable(alg::AbstractSciMLAlgorithm) = false """ -forwarddiffs_model(alg::AbstractDEAlgorithm) + forwarddiffs_model(alg::AbstractDEAlgorithm) -Trait declaration for whether an algorithm uses ForwardDiff.jl on the model -function is called with ForwardDiff.jl +Trait declaration for whether an algorithm uses ForwardDiff.jl on the model, +i.e. whether the model function is called with ForwardDiff.jl dual numbers. @@ -21,7 +21,7 @@ Defaults to false as only pure-Julia algorithms can have this be true. forwarddiffs_model(alg::AbstractSciMLAlgorithm) = false """ -forwarddiffs_model_time(alg::AbstractDEAlgorithm) + forwarddiffs_model_time(alg::AbstractDEAlgorithm) -Trait declaration for whether an algorithm uses ForwardDiff.jl on the model -`f(u,p,t)` function is called with ForwardDiff.jl on the `t` argument. +Trait declaration for whether an algorithm uses ForwardDiff.jl on the model, +i.e. whether `f(u,p,t)` is called with ForwardDiff.jl dual numbers on the `t` argument. 
@@ -32,7 +32,7 @@ have this as true forwarddiffs_model_time(alg::AbstractSciMLAlgorithm) = false """ -allows_arbitrary_number_types(alg::AbstractDEAlgorithm) + allows_arbitrary_number_types(alg::AbstractDEAlgorithm) Trait declaration for whether an algorithm is compatible with direct automatic differentiation, i.e. can have algorithms like @@ -44,7 +44,7 @@ Defaults to false as only pure-Julia algorithms can have this be true. allows_arbitrary_number_types(alg::AbstractSciMLAlgorithm) = false """ -allowscomplex(alg::AbstractDEAlgorithm) + allowscomplex(alg::AbstractDEAlgorithm) Trait declaration for whether an algorithm is compatible with having complex numbers as the state variables. @@ -54,7 +54,7 @@ Defaults to false. allowscomplex(alg::AbstractSciMLAlgorithm) = false """ -isadaptive(alg::AbstractDEAlgorithm) + isadaptive(alg::AbstractDEAlgorithm) Trait declaration for whether an algorithm uses adaptivity, i.e. has a non-quasi-static compute graph. @@ -65,7 +65,7 @@ isadaptive(alg::AbstractDEAlgorithm) = true # Default to assuming adaptive, safer error("Adaptivity algorithm trait not set.") """ -isdiscrete(alg::AbstractDEAlgorithm) + isdiscrete(alg::AbstractDEAlgorithm) Trait declaration for whether an algorithm allows for discrete state values, such as integers. @@ -75,7 +75,7 @@ Defaults to false. isdiscrete(alg::AbstractDEAlgorithm) = false """ -allowsbounds(opt) + allowsbounds(opt) Trait declaration for whether an optimizer allows for box constraints passed with `lb` and `ub` in @@ -86,7 +86,7 @@ Defaults to false. allowsbounds(opt) = false """ -requiresbounds(opt) + requiresbounds(opt) Trait declaration for whether an optimizer requires box constraints passed with `lb` and `ub` in @@ -97,7 +97,7 @@ Defaults to false. requiresbounds(opt) = false """ -allowsconstraints(opt) + allowsconstraints(opt) Trait declaration for whether an optimizer allows non-linear constraints specified in `cons` in @@ -108,7 +108,7 @@ Defaults to false. 
allowsconstraints(opt) = false """ -requiresconstraints(opt) + requiresconstraints(opt) Trait declaration for whether an optimizer requires non-linear constraints specified in @@ -119,7 +119,7 @@ Defaults to false. requiresconstraints(opt) = false """ -requiresgradient(opt) + requiresgradient(opt) Trait declaration for whether an optimizer requires gradient in `instantiate_function`. @@ -129,7 +129,7 @@ Defaults to false. requiresgradient(opt) = false """ -requireshessian(opt) + requireshessian(opt) Trait declaration for whether an optimizer requires hessian in `instantiate_function`. @@ -139,17 +139,17 @@ Defaults to false. requireshessian(opt) = false """ -requiresconsjac(opt) + requiresconsjac(opt) Trait declaration for whether an optimizer -requires cons_j in `instantiate_function`, that is, does the optimizer require a constant Jacobian. +requires `cons_j` in `instantiate_function`, that is, does the optimizer require the Jacobian of the constraints. Defaults to false. """ requiresconsjac(opt) = false """ -requiresconshess(opt) + requiresconshess(opt) Trait declaration for whether an optimizer -requires cons_h in `instantiate_function`, that is, does the optimizer require a constant hessian. +requires `cons_h` in `instantiate_function`, that is, does the optimizer require the Hessian of the constraints. @@ -159,7 +159,7 @@ Defaults to false. requiresconshess(opt) = false """ -allowscallback(opt) + allowscallback(opt) Trait declaration for whether an optimizer supports passing a `callback` to `solve` @@ -170,7 +170,7 @@ Defaults to true. allowscallback(opt) = true """ -alg_order(alg) + alg_order(alg) The theoretic convergence order of the algorithm. If the method is adaptive order, this is treated as the maximum order of the algorithm. diff --git a/src/debug.jl b/src/debug.jl index a9f7634d6..9cb5c5928 100644 --- a/src/debug.jl +++ b/src/debug.jl @@ -49,11 +49,11 @@ into the AbstractSciMLProblem (e.x.: ODEProblem) but the parameters object `p` w expression. Two common reasons for this issue are: 1. Forgetting to pass parameters into the problem constructor. 
For example, `ODEProblem(f,u0,tspan)` should -be `ODEProblem(f,u0,tspan,p)` in order to use parameters. + be `ODEProblem(f,u0,tspan,p)` in order to use parameters. 2. Using the wrong function signature. For example, with `ODEProblem`s the function signature is always -`f(du,u,p,t)` for the in-place form or `f(u,p,t)` for the out-of-place form. Note that the `p` argument -will always be in the function signature regardless of if the problem is defined with parameters! + `f(du,u,p,t)` for the in-place form or `f(u,p,t)` for the out-of-place form. Note that the `p` argument + will always be in the function signature regardless of whether the problem is defined with parameters! """ function __init__() diff --git a/src/integrator_interface.jl b/src/integrator_interface.jl index eec89165b..6970b44d0 100644 --- a/src/integrator_interface.jl +++ b/src/integrator_interface.jl @@ -283,7 +283,7 @@ function reinit!(integrator::DEIntegrator, args...; kwargs...) end """ -initialize_dae!(integrator::DEIntegrator,initializealg = integrator.initializealg) + initialize_dae!(integrator::DEIntegrator,initializealg = integrator.initializealg) Runs the DAE initialization to find a consistent state vector. The optional argument `initializealg` can be used to specify a different initialization @@ -927,7 +927,7 @@ end has_stats(i::DEIntegrator) = false """ - is_integrator_adaptive(i::DEIntegrator) + isadaptive(i::DEIntegrator) Checks if the integrator is adaptive """ diff --git a/src/operators/diffeq_operator.jl b/src/operators/diffeq_operator.jl index b62c2bc5b..d31ebec3e 100644 --- a/src/operators/diffeq_operator.jl +++ b/src/operators/diffeq_operator.jl @@ -1,5 +1,5 @@ """ -AffineDiffEqOperator{T} <: AbstractDiffEqOperator{T} + AffineDiffEqOperator{T} <: AbstractDiffEqOperator{T} `Ex: (A₁(t) + ... 
+ Bₘ(t)` diff --git a/src/problems/problem_utils.jl b/src/problems/problem_utils.jl index 8eaa6f8bf..15462e8b1 100644 --- a/src/problems/problem_utils.jl +++ b/src/problems/problem_utils.jl @@ -149,11 +149,11 @@ into the AbstractSciMLProblem (e.x.: ODEProblem) but the parameters object `p` w expression (e.x. `p[i]`, or `x .+ p`). Two common reasons for this issue are: 1. Forgetting to pass parameters into the problem constructor. For example, `ODEProblem(f,u0,tspan)` should -be `ODEProblem(f,u0,tspan,p)` in order to use parameters. + be `ODEProblem(f,u0,tspan,p)` in order to use parameters. 2. Using the wrong function signature. For example, with `ODEProblem`s the function signature is always -`f(du,u,p,t)` for the in-place form or `f(u,p,t)` for the out-of-place form. Note that the `p` argument -will always be in the function signature regardless of if the problem is defined with parameters! + `f(du,u,p,t)` for the in-place form or `f(u,p,t)` for the out-of-place form. Note that the `p` argument + will always be in the function signature regardless of whether the problem is defined with parameters! """ struct NullParameterIndexError <: Exception end diff --git a/src/retcodes.jl b/src/retcodes.jl index eacc71341..0910331b5 100644 --- a/src/retcodes.jl +++ b/src/retcodes.jl @@ -1,5 +1,5 @@ """ -`SciML.ReturnCode` + SciML.ReturnCode `SciML.ReturnCode` is the standard return code enum interface for the SciML interface. Return codes are notes given by the solvers to indicate the state of the solution, for @@ -31,7 +31,7 @@ did not error. """ EnumX.@enumx ReturnCode begin """ - `ReturnCode.Default` + ReturnCode.Default The default state of the solver. 
If this return code is given, then the solving process is either still in process or the solver library has not been setup @@ -49,12 +49,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ Default """ - `ReturnCode.Success` + ReturnCode.Success The success state of the solver. If this return code is given, then the solving process was successful, but no extra information about that success is given. @@ -66,12 +66,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = true + - `successful_retcode` = `true` """ Success """ - `ReturnCode.Terminated` + ReturnCode.Terminated The successful termination state of the solver. If this return code is given, then the solving process was successful at terminating the solve, usually @@ -90,12 +90,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = true + - `successful_retcode` = `true` """ Terminated """ - `ReturnCode.DtNaN` + ReturnCode.DtNaN A failure exit state of the solver. If this return code is given, then the solving process was unsuccessful and exited early because the `dt` of the @@ -113,12 +113,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ DtNaN """ - `ReturnCode.MaxIters` + ReturnCode.MaxIters A failure exit state of the solver. If this return code is given, then the solving process was unsuccessful and exited early because the solver's @@ -145,12 +145,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ MaxIters """ - `ReturnCode.DtLessThanMin` + ReturnCode.DtLessThanMin A failure exit state of the solver. 
If this return code is given, then the solving process was unsuccessful and exited early because the `dt` of the @@ -175,12 +175,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ DtLessThanMin """ - `ReturnCode.Unstable` + ReturnCode.Unstable A failure exit state of the solver. If this return code is given, then the solving process was unsuccessful and exited early because the `unstable_check` @@ -194,12 +194,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ Unstable """ - `ReturnCode.InitialFailure` + ReturnCode.InitialFailure A failure exit state of the solver. If this return code is given, then the solving process was unsuccessful because the initialization process failed. @@ -219,12 +219,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ InitialFailure """ - `ReturnCode.ConvergenceFailure` + ReturnCode.ConvergenceFailure A failure exit state of the solver. If this return code is given, then the solving process was unsuccessful because internal nonlinear solver iterations @@ -242,12 +242,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ ConvergenceFailure """ - `ReturnCode.Failure` + ReturnCode.Failure A failure exit state of the solver. If this return code is given, then the solving process was unsuccessful but no extra information is given. @@ -260,12 +260,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ Failure """ - `ReturnCode.ExactSolutionLeft` + ReturnCode.ExactSolutionLeft The success state of the solver. If this return code is given, then the solving process was successful, and the left solution was given. 
@@ -279,12 +279,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = true + - `successful_retcode` = `true` """ ExactSolutionLeft """ - `ReturnCode.ExactSolutionRight` + ReturnCode.ExactSolutionRight The success state of the solver. If this return code is given, then the solving process was successful, and the right solution was given. @@ -298,12 +298,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = true + - `successful_retcode` = `true` """ ExactSolutionRight """ - `ReturnCode.FloatingPointLimit` + ReturnCode.FloatingPointLimit The success state of the solver. If this return code is given, then the solving process was successful, and the closest floating point value to the solution was given. @@ -317,23 +317,23 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = true + - `successful_retcode` = `true` """ FloatingPointLimit """ - `ReturnCode.Infeasible` + ReturnCode.Infeasible The optimization problem was proven to be infeasible by the solver. ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ Infeasible """ - `ReturnCode.MaxTime` + ReturnCode.MaxTime A failure exit state of the solver. If this return code is given, then the solving process was unsuccessful and exited early because the solver's @@ -342,34 +342,34 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ MaxTime """ - `ReturnCode.InternalLineSearchFailed` + ReturnCode.InternalLineSearchFailed Internal Line Search used by the algorithm has failed. ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ InternalLineSearchFailed """ - `ReturnCode.ShrinkThresholdExceeded` + ReturnCode.ShrinkThresholdExceeded The trust region radius was shrunk more times than the provided threshold. 
## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ ShrinkThresholdExceeded """ - `ReturnCode.Stalled` + ReturnCode.Stalled The solution has stalled. This is only returned by algorithms for which stalling is a failure mode. Certain solvers like Nonlinear Least Squares solvers are considered @@ -377,12 +377,12 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ Stalled """ - `ReturnCode.InternalLinearSolveFailed` + ReturnCode.InternalLinearSolveFailed The linear problem inside another problem (for example inside a NonlinearProblem) could not be solved. @@ -394,7 +394,7 @@ EnumX.@enumx ReturnCode begin ## Properties - - successful_retcode = false + - `successful_retcode` = `false` """ InternalLinearSolveFailed end @@ -477,8 +477,8 @@ function Base.convert(::Type{ReturnCode.T}, bool::Bool) end """ -`successful_retcode(retcode::ReturnCode.T)::Bool` -`successful_retcode(sol::AbstractSciMLSolution)::Bool` + successful_retcode(retcode::ReturnCode.T)::Bool + successful_retcode(sol::AbstractSciMLSolution)::Bool Returns a boolean for whether a return code should be interpreted as a form of success. """ diff --git a/src/scimlfunctions.jl b/src/scimlfunctions.jl index becff1bf9..a89e2c9d6 100644 --- a/src/scimlfunctions.jl +++ b/src/scimlfunctions.jl @@ -70,7 +70,7 @@ Cases where automatic wrapping is disabled are equivalent to `FullSpecialize`. ## Example -``` +```julia f(du,u,p,t) = (du .= u) # Note this is the same as ODEProblem(f, [1.0], (0.0,1.0)) @@ -93,7 +93,7 @@ time. 
Unlike `AutoSpecialize`, `NoSpecialize` can be used with any ## Example -``` +```julia f(du,u,p,t) = (du .= u) ODEProblem{true, SciMLBase.NoSpecialize}(f, [1.0], (0.0,1.0)) ``` @@ -133,7 +133,7 @@ but also includes the limitations: ## Example -``` +```julia f(du,u,p,t) = (du .= u) ODEProblem{true, SciMLBase.FunctionWrapperSpecialize}(f, [1.0], (0.0,1.0)) ``` @@ -154,7 +154,7 @@ is required, such as in long-running simulations and benchmarking. ## Example -``` +```julia f(du,u,p,t) = (du .= u) ODEProblem{true, SciMLBase.FullSpecialize}(f, [1.0], (0.0,1.0)) ```
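
As a sanity check on the `successful_retcode` docstrings above, their stated behavior can be exercised as follows (a sketch, not part of the patch; it assumes only what the docstrings state, e.g. `successful_retcode = true` for `Success` and `Terminated` and `false` for `MaxIters`):

```julia
using SciMLBase
using SciMLBase: ReturnCode, successful_retcode

# Per the docstrings: success-like codes report `true`,
# failure codes such as MaxIters report `false`.
successful_retcode(ReturnCode.Success)     # true
successful_retcode(ReturnCode.Terminated)  # true
successful_retcode(ReturnCode.MaxIters)    # false
```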