doc changes re at-functor and at-layer (FluxML#2390)
* doc changes re at-functor and at-layer

* fix a doctest

* more fixes

* public at-layer

* add a sentence comparing to freeze/thaw

* Apply suggestions from code review

Co-authored-by: Kyle Daruwalla <[email protected]>

* two fixes re SignDecay

---------

Co-authored-by: Kyle Daruwalla <[email protected]>
2 people authored and isentropic committed Mar 13, 2024
1 parent 5618059 commit 4ce7033
Showing 12 changed files with 25 additions and 17 deletions.
2 changes: 1 addition & 1 deletion NEWS.md
@@ -7,7 +7,7 @@ See also [github's page](https://github.com/FluxML/Flux.jl/releases) for a compl
This also adds `show` methods for pretty printing.

## v0.14.12
* New `SignDecay` optimiser, like `` WeightNorm` but for L1 norm.
* New `SignDecay` optimiser, like `WeightDecay` but for L1 norm.

## v0.14.0 (July 2023)
* Flux now requires julia v1.9 or later.
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,6 +1,6 @@
name = "Flux"
uuid = "587475ba-b771-5e3f-ad9e-33799f191a9c"
version = "0.14.12"
version = "0.14.13"

[deps]
Adapt = "79e6a3ab-5dfb-504d-930d-738a2a938a0e"
2 changes: 1 addition & 1 deletion docs/src/models/advanced.md
@@ -102,7 +102,7 @@ Join(combine, paths...) = Join(combine, paths)
```
Notice that we parameterized the type of the `paths` field. This is necessary for fast Julia code; in general, `T` might be a `Tuple` or `Vector`, but we don't need to pay attention to what it specifically is. The same goes for the `combine` field.

The next step is to use [`Functors.@layer`](@ref) to make our struct behave like a Flux layer. This is important so that calling `params` on a `Join` returns the underlying weight arrays on each path.
The next step is to use [`Flux.@layer`](@ref) to make our struct behave like a Flux layer. This is important so that calling `Flux.setup` on a `Join` maps over the underlying trainable arrays on each path.
```julia
Flux.@layer Join
```
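For orientation, here is a condensed, self-contained sketch of the `Join` pattern this page builds up; the forward pass and the example model below are illustrative and not part of this commit's diff.

```julia
using Flux

struct Join{T, F}
    combine::F
    paths::T
end
Join(combine, paths...) = Join(combine, paths)

Flux.@layer Join   # Flux.setup, gpu, f32, ... now recurse into `combine` and `paths`

(m::Join)(xs::Tuple) = m.combine(map((f, x) -> f(x), m.paths, xs)...)
(m::Join)(xs...) = m(xs)

model = Join(vcat, Dense(2 => 3), Dense(4 => 3))
opt_state = Flux.setup(Adam(0.01), model)   # holds optimiser state for every weight and bias on both paths
```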
2 changes: 1 addition & 1 deletion docs/src/models/basics.md
@@ -255,7 +255,7 @@ m(5) # => 26

## Layer Helpers

There is still one problem with this `Affine` layer, that Flux does not know to look inside it. This means that [`Flux.train!`](@ref) won't see its parameters, nor will [`gpu`](@ref) be able to move them to your GPU. These features are enabled by the [`@functor`](@ref Functors.@functor) macro:
There is still one problem with this `Affine` layer, that Flux does not know to look inside it. This means that [`Flux.train!`](@ref) won't see its parameters, nor will [`gpu`](@ref) be able to move them to your GPU. These features are enabled by the [`@layer`](@ref Flux.@layer) macro:

```julia
Flux.@layer Affine
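A minimal sketch of what that paragraph promises, using the `Affine` layer defined earlier on the same page; the constructor and sizes here are illustrative.

```julia
using Flux

struct Affine
    W
    b
end
Affine(in::Integer, out::Integer) = Affine(randn(Float32, out, in), zeros(Float32, out))

(m::Affine)(x) = m.W * x .+ m.b

Flux.@layer Affine   # Flux can now look inside: training sees W and b, gpu can move them

m = Affine(3, 2)
opt_state = Flux.setup(Descent(0.1), m)   # optimiser state for both W and b
m_gpu = gpu(m)                            # a no-op on a machine without a working GPU
```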
6 changes: 5 additions & 1 deletion docs/src/models/functors.md
@@ -2,7 +2,11 @@

Flux models are deeply nested structures, and [Functors.jl](https://github.com/FluxML/Functors.jl) provides tools needed to explore such objects, apply functions to the parameters they contain, and re-build them.

New layers should be annotated using the `Functors.@functor` macro. This will enable [`params`](@ref Flux.params) to see the parameters inside, and [`gpu`](@ref) to move them to the GPU.
!!! compat "Flux ≤ 0.14"
All layers were previously defined with the `Functors.@functor` macro.
This still works, but it is recommended that you use the new [`Flux.@layer`](@ref Flux.@layer) macro instead.
Both allow [`Flux.setup`](@ref Flux.setup) to see the parameters inside, and [`gpu`](@ref) to move them to the GPU, but [`Flux.@layer`](@ref Flux.@layer) also overloads printing,
and offers a way to define `trainable` at the same time.

`Functors.jl` has its own [notes on basic usage](https://fluxml.ai/Functors.jl/stable/#Basic-Usage-and-Implementation) for more details. Additionally, the [Advanced Model Building and Customisation](@ref man-advanced) page covers the use cases of `Functors` in greater details.

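As a small illustration of the traversal Functors.jl provides (and which both `@functor` and `@layer` hook into), the sketch below walks a model with `fmap` to count its parameter entries; the model and the counting logic are illustrative only.

```julia
using Flux, Functors

model = Chain(Dense(3 => 4, relu), Dense(4 => 2))

n = Ref(0)
fmap(model) do leaf                      # visits every leaf of the nested structure
    leaf isa AbstractArray && (n[] += length(leaf))
    leaf                                 # fmap rebuilds the model from what we return
end
n[]   # 26 = (4×3 + 4) + (2×4 + 2)
```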
2 changes: 1 addition & 1 deletion docs/src/models/layers.md
@@ -12,7 +12,7 @@ The `Dense` exemplifies several features:

* The bias vector is always initialised [`Flux.zeros32`](@ref). The keyword `bias=false` will turn this off, i.e. keeping the bias permanently zero.

* It is annotated with [`@functor`](@ref Functors.@functor), which means that [`params`](@ref Flux.params) will see the contents, and [`gpu`](@ref Flux.gpu) will move their arrays to the GPU.
* It is annotated with [`@layer`](@ref Flux.@layer), which means that [`Flux.setup`](@ref Flux.setup) will see the contents, and [`gpu`](@ref Flux.gpu) will move their arrays to the GPU.

By contrast, `Chain` itself contains no parameters, but connects other layers together.
The section on [dataflow layers](@ref man-dataflow-layers) introduces others like this.
4 changes: 2 additions & 2 deletions docs/src/saving.md
@@ -16,12 +16,12 @@ julia> struct MyModel
net
end
julia> Flux.@functor MyModel
julia> Flux.@layer MyModel
julia> MyModel() = MyModel(Chain(Dense(10, 5, relu), Dense(5, 2)));
julia> model = MyModel()
MyModel(Chain(Dense(10 => 5, relu), Dense(5 => 2)))
MyModel(Chain(Dense(10 => 5, relu), Dense(5 => 2))) # 67 parameters
julia> model_state = Flux.state(model);
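For context, the full save/load round-trip that this doctest is building toward looks roughly like the sketch below, following the pattern documented on the same page; the file name and layer sizes are illustrative.

```julia
using Flux, JLD2

struct MyModel
    net
end
Flux.@layer MyModel
MyModel() = MyModel(Chain(Dense(10 => 5, relu), Dense(5 => 2)))

model = MyModel()
jldsave("mymodel.jld2"; model_state = Flux.state(model))   # save the state, not the code

# Later, possibly in a fresh session: rebuild the structure, then load the weights into it.
model2 = MyModel()
Flux.loadmodel!(model2, JLD2.load("mymodel.jld2", "model_state"))
```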
1 change: 1 addition & 0 deletions docs/src/training/optimisers.md
@@ -112,6 +112,7 @@ Similar to optimisers, Flux also defines some simple decays that can be used in
ExpDecay
InvDecay
WeightDecay
SignDecay
```
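A sketch of how the newly documented `SignDecay` might be combined with another rule, assuming the explicit `Flux.setup` interface and a version of Optimisers.jl that provides `SignDecay`:

```julia
using Flux, Optimisers

model = Dense(3 => 2)

# SignDecay adds an L1 penalty; WeightDecay would add an L2 penalty instead.
rule = Optimisers.OptimiserChain(Optimisers.SignDecay(1f-4), Optimisers.Adam(1f-3))
opt_state = Flux.setup(rule, model)

x, y = rand(Float32, 3, 8), rand(Float32, 2, 8)
grads = Flux.gradient(m -> sum(abs2, m(x) .- y), model)
Flux.update!(opt_state, model, grads[1])
```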

## Gradient Clipping
3 changes: 3 additions & 0 deletions docs/src/training/training.md
@@ -384,6 +384,9 @@ Flux.thaw!(opt_state)
The earlier "implicit" equivalent was to pass to `gradient` an object referencing only
part of the model, such as `Flux.params(bimodel.layers.enc)`.

While `adjust!` and `freeze!`/`thaw!` make temporary modifications to the optimiser state,
permanently removing some fields of a new layer type from training is usually done
when defining the layer, by calling for example [`@layer`](@ref Flux.@layer)` NewLayer trainable=(weight,)`.
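A sketch of that permanent, definition-time restriction; `PartlyFrozen` is a hypothetical layer, not something defined by Flux.

```julia
using Flux

struct PartlyFrozen         # hypothetical layer: train `weight`, never train `bias`
    weight
    bias
end
(l::PartlyFrozen)(x) = l.weight * x .+ l.bias

Flux.@layer PartlyFrozen trainable=(weight,)

l = PartlyFrozen(rand(Float32, 2, 3), zeros(Float32, 2))
opt_state = Flux.setup(Adam(), l)   # optimiser state only for `weight`;
                                    # `bias` is still moved by gpu/f32 but never updated
```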

## Implicit or Explicit?

4 changes: 2 additions & 2 deletions src/Flux.jl
@@ -34,11 +34,11 @@ export Chain, Dense, Embedding, Maxout, SkipConnection, Parallel, PairwiseFusion

@compat(public, ( # mark unexported symbols as API, on Julia 1.11
# modules
Losses,
Losses, Train,
# layers
Bilinear, Scale, dropout,
# utils
outputsize, state,
outputsize, state, create_bias, @layer,
))

include("optimise/Optimise.jl")
8 changes: 4 additions & 4 deletions src/functor.jl
@@ -286,7 +286,7 @@ _paramtype(::Type{T}, x::AbstractArray{<:Complex{<:AbstractFloat}}) where {T<:Ab
f32(m)
Converts the `eltype` of model's *floating point* parameters to `Float32` (which is Flux's default).
Recurses into structs marked with [`@functor`](@ref).
Recurses into structs marked with [`@layer`](@ref Flux.@layer).
See also [`f64`](@ref) and [`f16`](@ref).
"""
@@ -296,7 +296,7 @@ f32(m) = _paramtype(Float32, m)
f64(m)
Converts the `eltype` of model's *floating point* parameters to `Float64`.
Recurses into structs marked with [`@functor`](@ref).
Recurses into structs marked with [`@layer`](@ref Flux.@layer).
See also [`f32`](@ref) and [`f16`](@ref).
"""
@@ -306,7 +306,7 @@ f64(m) = _paramtype(Float64, m)
f16(m)
Converts the `eltype` of model's *floating point* parameters to `Float16`.
Recurses into structs marked with [`@functor`](@ref).
Recurses into structs marked with [`@layer`](@ref Flux.@layer).
Support for `Float16` is limited on many CPUs. Julia may
convert to `Float32` for each operation, which is slow.
@@ -330,7 +330,7 @@ Chain(
"""
f16(m) = _paramtype(Float16, m)

# Functors for certain Julia data structures
# Functors for certain Julia data structures -- PIRACY, should move to Functors.jl
@functor Cholesky
trainable(c::Cholesky) = ()

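A quick sketch of what these docstrings describe; the layer sizes are arbitrary.

```julia
using Flux

m = Chain(Dense(2 => 3, tanh), Dense(3 => 1))   # Float32 parameters by default

m64 = f64(m)                  # recurses through the Chain and both Dense layers
eltype(m64[1].weight)         # Float64

m16 = f16(m)                  # beware: Float16 is slow on many CPUs
eltype(m16[1].weight)         # Float16
```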
6 changes: 3 additions & 3 deletions src/layers/macro.jl
@@ -7,12 +7,12 @@
This macro replaces most uses of `@functor`. Its basic purpose is the same:
When you define a new layer, this tells Flux to explore inside it
to see the parameters it trains, and also to move them to the GPU, change precision, etc.
Like `@functor`, this assumes your struct has the default constructor, to enable re-building.
If you define an inner constructor (i.e. a function within the `struct` block) things may break.
The keyword `trainable` allows you to limit this exploration, instead of visiting all `fieldnames(T)`.
Note that it is never necessary to tell Flux to ignore non-array objects such as functions or sizes.
* If some fields look like parameters but should not be trained,
then `trainable` lets you specify which fields to include, while the rest are ignored.
The macro also handles overloads of `show` for pretty printing.
* By default, it adds methods to 3-arg `Base.show` to treat your layer much like `Dense` or `Conv`.
@@ -21,7 +21,7 @@ The macro also handles overloads of `show` for pretty printing.
(You probably still want to define 2-arg `show(io::IO, x::Layer)`, the macro does not touch this.)
Note that re-running the macro with different options may not overwrite all methods, you will need to restart.
Note that re-running the macro with different options may not remove all methods, you will need to restart.
# Example
```jldoctest
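As a brief illustration of the `show` half of this docstring, the sketch below marks a hypothetical single-field layer with `@layer`; `MyScale` is illustrative only.

```julia
using Flux

struct MyScale              # hypothetical layer with one parameter array
    s
end
(l::MyScale)(x) = l.s .* x

Flux.@layer MyScale         # functor behaviour plus 3-arg `show` methods

model = Chain(MyScale(rand(Float32, 3)), Dense(3 => 2))
# In the REPL, `model` now prints compactly with a trailing parameter count,
# much like a chain of built-in layers, instead of dumping raw arrays.
```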
