
Differentiation API

Derivatives of f(x::Real)::Union{Real,AbstractArray}

ForwardDiff.derivative (Function)
ForwardDiff.derivative(f, x::Real)

Return df/dx evaluated at x, assuming f is called as f(x).

This method assumes that isa(f(x), Union{Real,AbstractArray}).

source
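
For example, a minimal usage sketch (the target function and evaluation point below are arbitrary illustrations, not part of the API):

using ForwardDiff

ForwardDiff.derivative(x -> sin(x) + x^2, 1.0)   # cos(1.0) + 2.0 ≈ 2.5403
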
ForwardDiff.derivative(f!, y::AbstractArray, x::Real, cfg::DerivativeConfig = DerivativeConfig(f!, y, x), check=Val{true}())

Return df!/dx evaluated at x, assuming f! is called as f!(y, x) where the result is stored in y.

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source
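
A minimal sketch of this mutating form, assuming an illustrative two-element target f!:

using ForwardDiff

f!(y, x) = (y[1] = sin(x); y[2] = x^2; nothing)   # writes its output into y
y = zeros(2)
ForwardDiff.derivative(f!, y, 1.0)                # ≈ [cos(1.0), 2.0]; y now holds the primal output
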
ForwardDiff.derivative! (Function)
ForwardDiff.derivative!(result::Union{AbstractArray,DiffResult}, f, x::Real)

Compute df/dx evaluated at x and store the result(s) in result, assuming f is called as f(x).

This method assumes that isa(f(x), Union{Real,AbstractArray}).

source
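
For instance, when f returns an array, the derivative can be written into a preallocated buffer (the function, point, and buffer size below are illustrative assumptions):

using ForwardDiff

g(x) = [sin(x), cos(x), x^3]
out = zeros(3)
ForwardDiff.derivative!(out, g, 1.0)   # out ≈ [cos(1.0), -sin(1.0), 3.0]
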
ForwardDiff.derivative!(result::Union{AbstractArray,DiffResult}, f!, y::AbstractArray, x::Real, cfg::DerivativeConfig = DerivativeConfig(f!, y, x), check=Val{true}())

Compute df!/dx evaluated at x and store the result(s) in result, assuming f! is called as f!(y, x) where the result is stored in y.

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source

Gradients of f(x::AbstractArray)::Real

ForwardDiff.gradient (Function)
ForwardDiff.gradient(f, x::AbstractArray, cfg::GradientConfig = GradientConfig(f, x), check=Val{true}())

Return ∇f evaluated at x, assuming f is called as f(x). The array ∇f has the same shape as x, and its elements are ∇f[j, k, ...] = ∂f/∂x[j, k, ...].

This method assumes that isa(f(x), Real).

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source
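
For example, a minimal sketch with an arbitrary scalar-valued function:

using ForwardDiff

f(x) = sum(abs2, x)          # f(x) = x[1]^2 + x[2]^2 + x[3]^2
x = [1.0, 2.0, 3.0]
ForwardDiff.gradient(f, x)   # ≈ [2.0, 4.0, 6.0]
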
ForwardDiff.gradient! (Function)
ForwardDiff.gradient!(result::Union{AbstractArray,DiffResult}, f, x::AbstractArray, cfg::GradientConfig = GradientConfig(f, x), check=Val{true}())

Compute ∇f evaluated at x and store the result(s) in result, assuming f is called as f(x).

This method assumes that isa(f(x), Real).

source
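
A sketch of pairing gradient! with a DiffResult from the DiffResults package, so that the value and the gradient are captured in a single pass (the target function is an illustrative assumption):

using ForwardDiff, DiffResults

f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]
result = DiffResults.GradientResult(x)         # buffer holding both the value and the gradient
result = ForwardDiff.gradient!(result, f, x)
DiffResults.value(result)                      # f(x) = 14.0
DiffResults.gradient(result)                   # ≈ [2.0, 4.0, 6.0]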

Jacobians of f(x::AbstractArray)::AbstractArray

ForwardDiff.jacobian (Function)
ForwardDiff.jacobian(f, x::AbstractArray, cfg::JacobianConfig = JacobianConfig(f, x), check=Val{true}())

Return J(f) evaluated at x, assuming f is called as f(x). Multidimensional arrays are flattened in iteration order: the array J(f) has shape length(f(x)) × length(x), and its elements are J(f)[j,k] = ∂f(x)[j]/∂x[k]. When x is a vector, this means that jacobian(x->[f(x)], x) is the transpose of gradient(f, x).

This method assumes that isa(f(x), AbstractArray).

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source
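
For example (the vector-valued function below is an arbitrary illustration):

using ForwardDiff

g(x) = [x[1] * x[2], sin(x[1])]
x = [2.0, 3.0]
ForwardDiff.jacobian(g, x)   # 2×2 matrix ≈ [3.0 2.0; cos(2.0) 0.0]
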
ForwardDiff.jacobian(f!, y::AbstractArray, x::AbstractArray, cfg::JacobianConfig = JacobianConfig(f!, y, x), check=Val{true}())

Return J(f!) evaluated at x, assuming f! is called as f!(y, x) where the result is stored in y.

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source
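
A sketch of the mutating form, assuming an illustrative g! that writes its output into y:

using ForwardDiff

g!(y, x) = (y[1] = x[1] * x[2]; y[2] = sin(x[1]); nothing)
y = zeros(2)
x = [2.0, 3.0]
ForwardDiff.jacobian(g!, y, x)   # ≈ [3.0 2.0; cos(2.0) 0.0]; y now holds the primal output
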
ForwardDiff.jacobian! (Function)
ForwardDiff.jacobian!(result::Union{AbstractArray,DiffResult}, f, x::AbstractArray, cfg::JacobianConfig = JacobianConfig(f, x), check=Val{true}())

Compute J(f) evaluated at x and store the result(s) in result, assuming f is called as f(x).

This method assumes that isa(f(x), AbstractArray).

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source
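
For example, writing the Jacobian into a preallocated matrix (the function and array sizes below are illustrative assumptions):

using ForwardDiff

g(x) = [x[1] * x[2], sin(x[1])]
x = [2.0, 3.0]
J = zeros(2, 2)
ForwardDiff.jacobian!(J, g, x)   # J ≈ [3.0 2.0; cos(2.0) 0.0]
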
ForwardDiff.jacobian!(result::Union{AbstractArray,DiffResult}, f!, y::AbstractArray, x::AbstractArray, cfg::JacobianConfig = JacobianConfig(f!, y, x), check=Val{true}())

Compute J(f!) evaluated at x and store the result(s) in result, assuming f! is called as f!(y, x) where the result is stored in y.

This method assumes that isa(f(x), AbstractArray).

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source

Hessians of f(x::AbstractArray)::Real

ForwardDiff.hessian (Function)
ForwardDiff.hessian(f, x::AbstractArray, cfg::HessianConfig = HessianConfig(f, x), check=Val{true}())

Return H(f) (i.e. J(∇(f))) evaluated at x, assuming f is called as f(x).

This method assumes that isa(f(x), Real).

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source
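
For example, with an arbitrary scalar-valued function:

using ForwardDiff

f(x) = x[1]^2 * x[2] + x[2]^3
x = [1.0, 2.0]
ForwardDiff.hessian(f, x)   # ≈ [4.0 2.0; 2.0 12.0]
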
ForwardDiff.hessian! (Function)
ForwardDiff.hessian!(result::AbstractArray, f, x::AbstractArray, cfg::HessianConfig = HessianConfig(f, x), check=Val{true}())

Compute H(f) (i.e. J(∇(f))) evaluated at x and store the result(s) in result, assuming f is called as f(x).

This method assumes that isa(f(x), Real).

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source
ForwardDiff.hessian!(result::DiffResult, f, x::AbstractArray, cfg::HessianConfig = HessianConfig(f, result, x), check=Val{true}())

Exactly like ForwardDiff.hessian!(result::AbstractArray, f, x::AbstractArray, cfg::HessianConfig), but because isa(result, DiffResult), cfg is constructed as HessianConfig(f, result, x) instead of HessianConfig(f, x).

Set check to Val{false}() to disable tag checking. This can lead to perturbation confusion, so should be used with care.

source
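
A sketch of this DiffResult variant using DiffResults.HessianResult, which captures the value, gradient, and Hessian in one call (the target function is an illustrative assumption):

using ForwardDiff, DiffResults

f(x) = x[1]^2 * x[2] + x[2]^3
x = [1.0, 2.0]
result = DiffResults.HessianResult(x)
result = ForwardDiff.hessian!(result, f, x)
DiffResults.value(result)      # f(x) = 10.0
DiffResults.gradient(result)   # ≈ [4.0, 13.0]
DiffResults.hessian(result)    # ≈ [4.0 2.0; 2.0 12.0]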

Preallocating/Configuring Work Buffers

For the sake of convenience and performance, all "extra" information used by ForwardDiff's API methods is bundled up in the ForwardDiff.AbstractConfig family of types. These types allow the user to easily feed several different parameters to ForwardDiff's API methods, such as chunk size, work buffers, and perturbation seed configurations.

ForwardDiff's basic API methods will allocate these types automatically by default, but you can drastically reduce memory usage if you preallocate them yourself.

Note that for all constructors below, the chunk size N may be explicitly provided, or omitted, in which case ForwardDiff will automatically select a chunk size for you. However, it is highly recommended to specify the chunk size manually when possible (see Configuring Chunk Size).

Note also that configurations constructed for a specific function f cannot be reused to differentiate other functions (though they can be reused to differentiate f at different values). To construct a configuration that can be reused to differentiate any function, you can pass nothing as the function argument. While this is more flexible, it decreases ForwardDiff's ability to catch and prevent perturbation confusion.
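
As a concrete sketch of both points (the chunk size of 2 and the target function are arbitrary choices for illustration):

using ForwardDiff

f(x) = sum(abs2, x)
x = rand(10)

cfg = ForwardDiff.GradientConfig(f, x, ForwardDiff.Chunk{2}())   # explicit chunk size, preallocated buffers
ForwardDiff.gradient(f, x, cfg)                                  # reuses cfg; no per-call buffer allocation

anycfg = ForwardDiff.GradientConfig(nothing, x)   # reusable with any target function
ForwardDiff.gradient(f, x, anycfg)                # works, but with weaker tag checking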

ForwardDiff.DerivativeConfig (Type)
ForwardDiff.DerivativeConfig(f!, y::AbstractArray, x::AbstractArray)

Return a DerivativeConfig instance based on the type of f!, and the types/shapes of the output vector y and the input vector x.

The returned DerivativeConfig instance contains all the work buffers required by ForwardDiff.derivative and ForwardDiff.derivative! when the target function takes the form f!(y, x).

If f! is nothing instead of the actual target function, then the returned instance can be used with any target function. However, this will reduce ForwardDiff's ability to catch and prevent perturbation confusion (see https://github.com/JuliaDiff/ForwardDiff.jl/issues/83).

This constructor does not store/modify y or x.

source
ForwardDiff.GradientConfig (Type)
ForwardDiff.GradientConfig(f, x::AbstractArray, chunk::Chunk = Chunk(x))

Return a GradientConfig instance based on the type of f and type/shape of the input vector x.

The returned GradientConfig instance contains all the work buffers required by ForwardDiff.gradient and ForwardDiff.gradient!.

If f is nothing instead of the actual target function, then the returned instance can be used with any target function. However, this will reduce ForwardDiff's ability to catch and prevent perturbation confusion (see https://github.com/JuliaDiff/ForwardDiff.jl/issues/83).

This constructor does not store/modify x.

source
ForwardDiff.JacobianConfig (Type)
ForwardDiff.JacobianConfig(f, x::AbstractArray, chunk::Chunk = Chunk(x))

Return a JacobianConfig instance based on the type of f and type/shape of the input vector x.

The returned JacobianConfig instance contains all the work buffers required by ForwardDiff.jacobian and ForwardDiff.jacobian! when the target function takes the form f(x).

If f is nothing instead of the actual target function, then the returned instance can be used with any target function. However, this will reduce ForwardDiff's ability to catch and prevent perturbation confusion (see https://github.com/JuliaDiff/ForwardDiff.jl/issues/83).

This constructor does not store/modify x.

source
ForwardDiff.JacobianConfig(f!, y::AbstractArray, x::AbstractArray, chunk::Chunk = Chunk(x))

Return a JacobianConfig instance based on the type of f!, and the types/shapes of the output vector y and the input vector x.

The returned JacobianConfig instance contains all the work buffers required by ForwardDiff.jacobian and ForwardDiff.jacobian! when the target function takes the form f!(y, x).

If f! is nothing instead of the actual target function, then the returned instance can be used with any target function. However, this will reduce ForwardDiff's ability to catch and prevent perturbation confusion (see https://github.com/JuliaDiff/ForwardDiff.jl/issues/83).

This constructor does not store/modify y or x.

source
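
For instance, preallocating a JacobianConfig for a mutating target (the function, chunk size, and array shapes below are illustrative assumptions):

using ForwardDiff

g!(y, x) = (y[1] = x[1] * x[2]; y[2] = sin(x[1]); nothing)
y = zeros(2)
x = [2.0, 3.0]
cfg = ForwardDiff.JacobianConfig(g!, y, x, ForwardDiff.Chunk{2}())
ForwardDiff.jacobian(g!, y, x, cfg)   # same Jacobian as with the default config, buffers reused
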
ForwardDiff.HessianConfig (Type)
ForwardDiff.HessianConfig(f, x::AbstractArray, chunk::Chunk = Chunk(x))

Return a HessianConfig instance based on the type of f and type/shape of the input vector x.

The returned HessianConfig instance contains all the work buffers required by ForwardDiff.hessian and ForwardDiff.hessian!. For the latter, the buffers are configured for the case where the result argument is an AbstractArray. If it is a DiffResult, the HessianConfig should instead be constructed via ForwardDiff.HessianConfig(f, result, x, chunk).

If f is nothing instead of the actual target function, then the returned instance can be used with any target function. However, this will reduce ForwardDiff's ability to catch and prevent perturbation confusion (see https://github.com/JuliaDiff/ForwardDiff.jl/issues/83).

This constructor does not store/modify x.

source
ForwardDiff.HessianConfig(f, result::DiffResult, x::AbstractArray, chunk::Chunk = Chunk(x))

Return a HessianConfig instance based on the type of f, types/storage in result, and type/shape of the input vector x.

The returned HessianConfig instance contains all the work buffers required by ForwardDiff.hessian! for the case where the result argument is a DiffResult.

If f is nothing instead of the actual target function, then the returned instance can be used with any target function. However, this will reduce ForwardDiff's ability to catch and prevent perturbation confusion (see https://github.com/JuliaDiff/ForwardDiff.jl/issues/83).

This constructor does not store/modify x.

source