Add WeightNorm reparametrization #2550

Merged · 12 commits merged into master from pxl-th/weightnorm on Dec 13, 2024
Conversation

@pxl-th (Member) commented Dec 12, 2024

Yet another attempt at adding WeightNorm, based on bits from #2053 and #1005, with tests and documentation.
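For background, weight normalization reparametrizes a weight tensor w into a direction v and a magnitude g, so that w = g * v / ‖v‖ and the two can be optimized independently. A minimal sketch of the idea in plain Julia (the array shapes and the dims choice here are illustrative, not this PR's API):

W = randn(Float32, 4, 3)          # original weight matrix (out × in)
v = copy(W)                       # direction parameter, trained freely
g = sqrt.(sum(abs2, v; dims=2))   # magnitude parameter, one entry per output row
n2 = sum(abs2, v; dims=2)         # squared row norms of v
Wn = @. g * v / sqrt(n2)          # effective weight: row i of Wn has norm g[i]
# At initialization Wn ≈ W; during training g and v are updated independently.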

PR Checklist

  • Tests are added
  • Entry in NEWS.md
  • Documentation, if applicable

@CarloLucibello (Member) commented Dec 12, 2024

This way we duplicate the weights. Can we let the original layer's weights act as v? Something like:

struct WeightNorm{which, dims, L, G}
    layer::L   # wrapped layer; its `which` field acts as the direction v
    g::G       # magnitude parameter
end

(w::WeightNorm)(x) = transform(w)(x)

function transform(wn::WeightNorm{which, dims}) where {which, dims}
    v = getfield(wn.layer, which)
    ϵ = eps(eltype(v))
    n2 = sum(abs2, v; dims)
    w = @. wn.g * v / sqrt(n2 + ϵ)   # w = g * v / ‖v‖
    fields, ctor = Functors.functor(wn.layer)
    return ctor(merge(
        fields, NamedTuple{(which,)}((w,)),
    ))
end
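Under this design, a convenience constructor could initialize g from the wrapped layer's current weights, so the reparametrized layer starts out numerically equivalent to the original. A sketch assuming the struct above; the constructor, the :weight default, and dims = 2 are illustrative, not the merged API:

using Flux, Functors

function WeightNorm(layer, which::Symbol = :weight; dims = 2)
    v = getfield(layer, which)
    g = sqrt.(sum(abs2, v; dims))   # g = ‖v‖, so transform(wn) initially reproduces `layer`
    return WeightNorm{which, dims, typeof(layer), typeof(g)}(layer, g)
end

wn = WeightNorm(Dense(3 => 4))      # Dense's weight is (out × in), so dims = 2 normalizes each row
y = wn(randn(Float32, 3, 5))        # forward pass goes through transform(wn)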

codecov bot commented Dec 12, 2024

Codecov Report

Attention: Patch coverage is 76.92308% with 6 lines in your changes missing coverage. Please review.

Project coverage is 32.78%. Comparing base (2bbd8b3) to head (54319e1).
Report is 1 commit behind head on master.

Files with missing lines    Patch %    Lines
src/layers/normalise.jl     76.92%     6 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #2550      +/-   ##
==========================================
+ Coverage   31.95%   32.78%   +0.82%     
==========================================
  Files          34       34              
  Lines        1987     1998      +11     
==========================================
+ Hits          635      655      +20     
+ Misses       1352     1343       -9     


Review threads on src/layers/normalise.jl, src/Flux.jl, and test/layers/normalisation.jl were resolved during review.
@pxl-th (Member, Author) commented Dec 12, 2024

Not yet ready; needs some adjustments for GPUs.

@pxl-th pxl-th marked this pull request as draft December 12, 2024 18:49
@pxl-th pxl-th marked this pull request as ready for review December 12, 2024 21:14
@pxl-th (Member, Author) commented Dec 12, 2024

Added GPU tests as well, making a small test suite which we can expand in the future to avoid duplication.
Should be ready for review.

@CarloLucibello (Member) commented:

The doctest needs to be fixed. Besides that, it looks good.

@pxl-th (Member, Author) commented Dec 13, 2024

All tests now pass.

@CarloLucibello CarloLucibello merged commit 9050ef0 into master Dec 13, 2024
6 of 9 checks passed
@pxl-th pxl-th deleted the pxl-th/weightnorm branch December 13, 2024 08:39
2 participants