Hello! I am opening this issue for further discussion on docstrings, doctests, marking internal API (by prepending _), and updating the manual. Before I open a PR, I think it would be best to discuss the documentation here. I hope this is okay, as I couldn't find a Discussions tab in Flux's repository.
onehot.jl
onehot.jl is missing doctests for the structs returned by onehotbatch and onehot. I am not sure whether these structs are supposed to be user-facing. The docstring of the Embedding struct uses OneHotMatrix directly, but other than that, I couldn't find any other use cases. Should I add some doctests to these structs? I can also add them to the manual if they are supposed to be user-facing.
Additionally, right now if a user tries to access the documentation of OneHotVector or OneHotMatrix, they are presented with the documentation (and the type signature) of OneHotArray. After reading Flux's code the reason becomes obvious, but I don't think every user will want to dive into Flux's code. Should these structs have their own independent docstrings?
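To make this concrete, here is a rough sketch of the kind of doctest these structs could carry, assuming the current behaviour where onehot returns a OneHotVector and onehotbatch returns a OneHotMatrix (both aliases of OneHotArray). The printed form of these arrays varies between versions, so this sketch only checks types and sizes:

```julia
julia> using Flux

julia> v = Flux.onehot(:b, [:a, :b, :c]);  # one-hot encoding of :b over the labels :a, :b, :c

julia> v isa Flux.OneHotVector
true

julia> m = Flux.onehotbatch([:b, :a, :b], [:a, :b, :c]);  # one one-hot column per input element

julia> m isa Flux.OneHotMatrix
true

julia> size(m)  # (number of labels, number of inputs)
(3, 3)
```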
utils.jl
utils.jl has a few functions with no doctests, but it isn't clear whether they are user-facing or not. From my previous PRs I have learned that the internal API doesn't need doctests, so should I prepend a _ to these functions, or should I add doctests to them (if they are user-facing)? The functions (a sketch of a possible doctest follows after this list) -
Not present in the manual -
rng_from_array
ones32
zeros32
create_bias
Present in the manual -
rand32
randn32
throttle
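As an illustration, a doctest for the user-facing ones among these could look roughly like the sketch below. This assumes ones32, zeros32 and rand32 keep behaving like their Base counterparts with a Float32 element type; rand32 produces random values, so only the element type is checked:

```julia
julia> using Flux

julia> Flux.ones32(2, 3)  # equivalent to ones(Float32, 2, 3)
2×3 Matrix{Float32}:
 1.0  1.0  1.0
 1.0  1.0  1.0

julia> Flux.zeros32(2)
2-element Vector{Float32}:
 0.0
 0.0

julia> eltype(Flux.rand32(4))  # values are random, so only the element type is stable
Float32
```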
functor.jl
The exact same case as that of utils.jl -
Present in the manual -
testmode!
trainmode!
f32
f64
Additionally, I think cpu and gpu are user-facing, but their APIs aren't documented in the manual.
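For reference, a minimal sketch of what a doctest for f32/f64 (and cpu) could look like is below. The gpu half is left out since doctests run on machines without a GPU, and accessing the weight field assumes the current Dense layout:

```julia
julia> using Flux

julia> m = Dense(2 => 1);

julia> eltype(m.weight)  # Flux layers default to Float32 parameters
Float32

julia> m64 = f64(m);  # convert all parameters to Float64

julia> eltype(m64.weight)
Float64

julia> m32 = f32(m64);  # and back to Float32

julia> eltype(m32.weight)
Float32

julia> m_cpu = cpu(m32);  # a no-op here, but moves GPU arrays back to host memory
```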
optimisers.jl
Currently, none of the optimisers have doctests, and I am not sure how to add them without excessive repetition. For example, I came up with the following doctest for the Descent optimiser -
```jldoctest
julia> f(x) = 4x + 2;

julia> x = hcat(0:5...); y = f.(x);

julia> model = Dense(1 => 1);

julia> opt = Descent(0.01);

julia> loss(x, y) = Flux.Losses.mse(model(x), y);

julia> ps = Flux.params(model) # initialised parameters
Params([Float32[-1.1358223;;], Float32[0.0]])

julia> loss(x, y) # loss with initial parameters
297.14438f0

julia> gs = gradient(ps) do
           loss(x, y)
       end;

julia> Flux.Optimise.update!(opt, ps, gs) # the parameters will be updated, and the loss should ideally go down

julia> ps
Params([Float32[-0.09425485;;], Float32[0.29679117]])

julia> loss(x, y)
191.4279f0
```
This does look a bit verbose, but I am not sure how else to document it properly. My main concern is that this would have to be repeated for all the optimisers if it is added. A workaround could be to initialise the shared variables only once, in a named jldoctest block -
""" Descent(η = 0.1)...\```jldoctest optimisersjulia> f(x) = 4x + 2;julia> x = hcat(0:5...); y = f.(x);julia> model = Dense(1 => 1);julia> opt = Descent(0.01);julia> loss(x, y) = Flux.Losses.mse(model(x), y);julia> ps = Flux.params(model) # initialised parametersParams([Float32[-1.1358223;;], Float32[0.0]])julia> loss(x, y) # loss with initial parameters297.14438f0julia> gs = gradient(ps) do loss(x, y) end;julia> Flux.Optimise.update!(opt, ps, gs) # the parameters will be updated, and the loss should ideally go downjulia> psParams([Float32[-0.09425485;;], Float32[0.29679117]])julia> loss(x, y)191.4279f0\```"""mutable struct Descent <:AbstractOptimiser...""" Momentum(η = 0.01, ρ = 0.9)...\```jldoctest optimisersjulia> opt = Momentum(0.03, 0.99);julia> ps = Flux.params(model)Params([Float32[-0.09425485;;], Float32[0.29679117]])julia> loss(x, y)191.4279f0julia> gs = gradient(ps) do loss(x, y) end;julia> Flux.Optimise.update!(opt, ps, gs) # the parameters will be updatedjulia> ps # updated parametersParams([Float32[2.4130664;;], Float32[1.013122]])julia> loss(x, y)31.88943f0\```"""mutable struct Momentum <:AbstractOptimiser...
I am not sure whether I should proceed with this.
Suggestions from everyone are welcome :)
utils.jl: ones32, rand32, etc. have docstrings; I'm not sure it matters too much whether these have examples, but they should probably be in the manual. rng_from_array and create_bias are internal.
functor.jl: cpu & gpu should surely be in the manual.
Thank you for the suggestion! I will create a PR for updating the docstrings. I will leave Optimise.jl and onehot.jl as they are for now. I'll definitely pick up some documentation work in the new packages!