Releases · FluxML/NNlib.jl
v0.8.7
NNlib v0.8.7
Closed issues:
- v0.8.6 contains breaking changes (#412)
Merged pull requests:
- partly revert changes to stay compatible w NNlibCUDA (#414) (@maxfreu)
- Update CompatHelper.yml (#419) (@CarloLucibello)
- CompatHelper: bump compat for Compat to 4, (keep existing compat) (#420) (@github-actions[bot])
v0.8.6
NNlib v0.8.6
Merged pull requests:
v0.8.5
v0.8.4
NNlib v0.8.4
Closed issues:
- Bug report: `leakyrelu'.(CuArray(rand(Float32, 10)))` fails on 0.8.3 (works on 0.8.2) (#398)
- NNlib 0.8.3 broke NNlibCUDA (#400)
Merged pull requests:
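The #398 report above broadcasts the derivative of `leakyrelu` over a `CuArray`. For reference, here is a minimal plain-Julia sketch of the `leakyrelu` activation itself (CPU-only; the default negative slope of 0.01 is an assumption, and `leakyrelu_sketch` is a hypothetical name, not NNlib's implementation):

```julia
# Plain-Julia sketch of a leaky ReLU activation (assumption: negative
# slope a = 0.01, matching NNlib's documented default). The max-based
# form is valid for slopes a in (0, 1).
leakyrelu_sketch(x, a = oftype(float(x), 0.01)) = max(a * x, x)

leakyrelu_sketch(2.0)    # positive inputs pass through unchanged
leakyrelu_sketch(-1.0)   # negative inputs are scaled by the slope
```

Using `oftype(float(x), 0.01)` keeps the slope in the input's element type, so broadcasting over a `Float32` array does not promote to `Float64`.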
v0.8.3
NNlib v0.8.3
Closed issues:
- Nested AD not defined for conv; any plan? (#119)
- Latest release (0.7.9) broke softmax gradients with Tracker (#251)
- `ForwardDiff` and `Conv`: Slow fallback implementation invoked for conv! (#349)
- `rrule(::typeof(∇conv_filter))` (#362)
- move the exports to the main source file (#394)
Merged pull requests:
- Silence some warnings (#383) (@mcabbott)
- Fix gradient of convolution for complex values (#389) (@zsoerenm)
- Conjugate weight before multiplying it with input (#390) (@zsoerenm)
- Improve some activation function gradients (#392) (@mcabbott)
- Simplify `softmax`, test second derivatives (#393) (@mcabbott)
- Change libblas to libblastrampoline (#396) (@theabhirath)
- Move exports to main source file (#397) (@vincentmolin)
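Several entries in this release touch `softmax` and its derivatives (#251, #393). For reference, a minimal plain-Julia sketch of the numerically stabilized softmax that NNlib exports (`softmax_sketch` is a hypothetical stand-in, not NNlib's actual implementation):

```julia
# Plain-Julia sketch of softmax with the standard max-subtraction trick:
# subtracting maximum(x) before exponentiating avoids overflow without
# changing the result, since softmax is shift-invariant.
function softmax_sketch(x::AbstractVector)
    e = exp.(x .- maximum(x))
    e ./ sum(e)
end

y = softmax_sketch(Float32[1, 2, 3])
sum(y)  # the entries form a probability distribution summing to 1
```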
v0.8.2
v0.8.1
NNlib v0.8.1
Merged pull requests:
- Fix CI badge (#382) (@darsnack)
- trigger op specialization in scatter (#384) (@CarloLucibello)
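PR #384 above concerns op specialization in `scatter`. For reference, a minimal plain-Julia sketch of scatter's accumulate-by-index semantics (assuming a vector `src`, integer indices, and a zero-initialized destination suitable for `+`; NNlib's real `scatter` is considerably more general):

```julia
# Plain-Julia sketch of a scatter operation: fold each src entry into
# dst at the position named by idx, combining collisions with `op`.
# Assumption: zero is a valid initial value for `op` (true for +).
function scatter_sketch(op, src::AbstractVector, idx::AbstractVector{Int})
    dst = zeros(eltype(src), maximum(idx))
    for (s, i) in zip(src, idx)
        dst[i] = op(dst[i], s)
    end
    dst
end

scatter_sketch(+, [1, 2, 3, 4], [1, 2, 1, 2])  # accumulate by index
```

Passing `op` as an argument (rather than hard-coding `+`) is what lets the compiler specialize the loop per operation, which is the point of #384.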
v0.8.0
NNlib v0.8.0
Merged pull requests:
- Fix convolution & pooling type-stability (#370) (@pxl-th)
- Informative error message for softmax variants (#378) (@theabhirath)
- Make ACTIVATIONS vector constant. (#379) (@eliascarv)
- Add downstream reverse CI (#380) (@ToucheSir)
- Add codecov token secret (#381) (@ToucheSir)