Commit 89fc89f: fix doctest

CarloLucibello committed Dec 13, 2024
1 parent 47b570c commit 89fc89f
Showing 5 changed files with 7 additions and 5 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -75,4 +75,4 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCUMENTER_KEY: ${{ secrets.DOCUMENTER_KEY }}
+       DATADEPS_ALWAYS_ACCEPT: true

1 change: 1 addition & 0 deletions docs/make.jl
@@ -2,6 +2,7 @@ using Documenter, Flux, NNlib, Functors, MLUtils, BSON, Optimisers,
OneHotArrays, Zygote, ChainRulesCore, Plots, MLDatasets, Statistics,
DataFrames, JLD2, MLDataDevices

+ ENV["DATADEPS_ALWAYS_ACCEPT"] = true

DocMeta.setdocmeta!(Flux, :DocTestSetup, :(using Flux); recursive = true)

4 changes: 2 additions & 2 deletions docs/src/guide/models/basics.md
@@ -185,7 +185,7 @@ These matching nested structures are at the core of how Flux works.
<h3><img src="../../../assets/zygote-crop.png" width="40px"/>&nbsp;<a href="https://github.com/FluxML/Zygote.jl">Zygote.jl</a></h3>
```

- Flux's [`gradient`](@ref) function by default calls a companion package called [Zygote](https://github.com/FluxML/Zygote.jl).
+ Flux's [`gradient`](@ref Flux.gradient) function by default calls a companion package called [Zygote](https://github.com/FluxML/Zygote.jl).
Zygote performs source-to-source automatic differentiation, meaning that `gradient(f, x)`
hooks into Julia's compiler to find out what operations `f` contains, and transforms this
to produce code for computing `∂f/∂x`.
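
As a quick illustration of the `gradient` function this hunk re-references (a minimal sketch, not part of the commit itself):

```julia
using Flux  # `gradient` is provided via Zygote

# Differentiate a scalar polynomial: d/dx (3x^2 + 2x + 1) = 6x + 2
g = gradient(x -> 3x^2 + 2x + 1, 2.0)
# g == (14.0,) — a tuple, with one entry per argument of the function
```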
@@ -372,7 +372,7 @@ How does this `model3` differ from the `model1` we had before?
Its contents are stored in a tuple, thus `model3.layers[1].weight` is an array.
* Flux's layer [`Dense`](@ref Flux.Dense) has only minor differences from our `struct Layer`:
- Like `struct Poly3{T}` above, it has type parameters for its fields -- the compiler does not know exactly what type `layer3s.W` will be, which costs speed.
-   - Its initialisation uses not `randn` (normal distribution) but [`glorot_uniform`](@ref) by default.
+   - Its initialisation uses not `randn` (normal distribution) but [`glorot_uniform`](@ref Flux.glorot_uniform) by default.
- It reshapes some inputs (to allow several batch dimensions), and produces more friendly errors on wrong-size input.
- And it has some performance tricks: making sure element types match, and re-using some memory.
* The function [`σ`](@ref NNlib.sigmoid) is calculated in a slightly better way,
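The contrast the bullets above draw between a hand-rolled layer and `Dense` can be sketched as follows. The `Layer` struct here is a hypothetical stand-in for the guide's `struct Layer`, which is not shown in this diff:

```julia
using Flux

# A hand-rolled layer: weights from `randn`, no input-size checking.
struct Layer{M, V, F}
    W::M
    b::V
    act::F
end
(l::Layer)(x) = l.act.(l.W * x .+ l.b)

layer = Layer(randn(Float32, 3, 2), zeros(Float32, 3), sigmoid)

# Flux's Dense does the same job, but initialises with glorot_uniform
# by default and gives friendly errors on wrong-size input:
dense = Dense(2 => 3, sigmoid)

x = rand(Float32, 2, 5)  # 2 features, batch of 5
size(layer(x)) == size(dense(x)) == (3, 5)
```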
1 change: 1 addition & 0 deletions docs/src/reference/models/layers.md
@@ -104,6 +104,7 @@ PairwiseFusion
Much like the core layers above, but can be used to process sequence data (as well as other kinds of structured data).

```@docs
+ Recurrence
RNNCell
RNN
LSTMCell
4 changes: 2 additions & 2 deletions src/layers/recurrent.jl
@@ -12,7 +12,7 @@ function scan(cell, x, state)
end

"""
-     Recurrent(cell)
+     Recurrence(cell)
Create a recurrent layer that processes entire sequences out
of a recurrent `cell`, such as an [`RNNCell`](@ref), [`LSTMCell`](@ref), or [`GRUCell`](@ref),
@@ -52,7 +52,7 @@ stack(out, dims = 2)
# Examples
```jldoctest
- julia> rnn = Recurrent(RNNCell(2 => 3))
+ julia> rnn = Recurrence(RNNCell(2 => 3))
julia> x = rand(Float32, 2, 3, 4); # in x len x batch_size
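For context on the doctest being fixed, this is roughly what the corrected example computes (a sketch, assuming a Flux version where `Recurrence` exists, i.e. the one this commit targets):

```julia
using Flux

rnn = Recurrence(RNNCell(2 => 3))  # wrap a cell to process whole sequences
x = rand(Float32, 2, 3, 4)         # in × len × batch
y = rnn(x)
size(y)  # (3, 3, 4): out × len × batch, per the `stack(out, dims = 2)` above
```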
