Commit

more rebase problems
Co-authored-by: Carlo Lucibello <[email protected]>
mcabbott and CarloLucibello authored Nov 24, 2024
1 parent 4d460a4 commit 126e7bd
Showing 2 changed files with 1 addition and 22 deletions.
21 changes: 0 additions & 21 deletions src/optimise/train.jl
@@ -1,24 +1,3 @@
 using ProgressLogging: @progress, @withprogress, @logprogress
 import Zygote: Params, gradient, withgradient
 
-# Add methods to Optimisers.jl's function, so that there is just one Flux.update!
-# for both explicit and implicit parameters.
-import Optimisers.update!
-
-"""
-    update!(opt, p, g)
-    update!(opt, ps::Params, gs)
-
-Perform an update step of the parameters `ps` (or the single parameter `p`)
-according to optimiser `opt::AbstractOptimiser` and the gradients `gs` (the gradient `g`).
-
-As a result, the parameters are mutated and the optimiser's internal state may change.
-The gradient could be mutated as well.
-
-!!! compat "Deprecated"
-    This method for implicit `Params` (and `AbstractOptimiser`) will be removed from Flux 0.15.
-    The explicit method `update!(opt, model, grad)` from Optimisers.jl will remain.
-"""
-function update!(opt::AbstractOptimiser, x::AbstractArray, x̄)
-  x̄r = copyto!(similar(x̄), x̄)  # Flux.Optimise assumes it can mutate the gradient. This is not
-                               # safe due to aliasing, nor guaranteed to be possible, e.g. Fill.
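For context, the deleted docstring points users at the explicit `update!(opt, model, grad)` API from Optimisers.jl, which remains after this cleanup. A minimal sketch of that explicit style (the model, optimiser, loss, and data below are illustrative, not part of this commit):

```julia
using Flux

# Illustrative model and data: a 2-feature linear layer, batch of 8
model = Flux.Dense(2 => 1)
x = rand(Float32, 2, 8)
y = rand(Float32, 1, 8)

# Explicit style: build optimiser state for the model once...
opt_state = Flux.setup(Flux.Adam(0.01), model)

# ...compute gradients with respect to the model itself (no implicit Params)...
grads = Flux.gradient(m -> Flux.mse(m(x), y), model)

# ...then update the model and optimiser state in place.
Flux.update!(opt_state, model, grads[1])
```

Here `Flux.update!(opt_state, model, grads[1])` dispatches to the Optimisers.jl method that the removed docstring's compat note says will remain.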
2 changes: 1 addition & 1 deletion src/train.jl
@@ -11,7 +11,7 @@ using Zygote: Zygote
 export setup, train!
 
 using ProgressLogging: @progress, @withprogress, @logprogress
-using Zygote: Zygote, Params
+using Zygote: Zygote
 using EnzymeCore: Duplicated
 
 """
