The issue has been reported in FluxML/Flux.jl#2513 and is due to these specializations: https://github.com/LuxDL/MLDataDevices.jl/blob/17419d27888e3a48b52f318249a0f037524f0f1e/src/public.jl#L341

The simplest solution would be to remove the specializations and let fmap handle everything. Possible optimized implementations could be upstreamed to Functors.jl, but we really want to keep track of object identity here.
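For context, a minimal sketch of what falling back to plain fmap could look like: fmap threads an identity-based cache through the traversal, so shared (tied) arrays are converted once and stay shared afterwards, which is the object-identity property mentioned above. The adapt_leaf and move names below are only illustrative stand-ins, not MLDataDevices code:

```julia
using Functors  # provides fmap, which walks nested structs with an identity cache

# Illustrative leaf transform standing in for a real device conversion
# (e.g. copying an Array to a GPU array); not the MLDataDevices implementation.
adapt_leaf(x::AbstractArray) = copy(x)
adapt_leaf(x) = x

# Move every array in a (possibly deeply nested) model with a single fmap call.
# fmap caches results by object identity, so tied weights that appear in two
# places are converted once and remain the same object afterwards.
move(model) = fmap(adapt_leaf, model; exclude = x -> x isa AbstractArray)

# Quick check of the identity-preservation property:
w = rand(Float32, 3, 3)
tied = (layer1 = (; weight = w), layer2 = (; weight = w))
moved = move(tied)
@assert moved.layer1.weight === moved.layer2.weight  # tying preserved
```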
Removing those specializations makes hot loops completely type-unstable for common types.
Why has the repo been archived?
With FluxML/Functors.jl#82, fmap is type-stable for all types.
Let's do this together with the update to Functors v0.5.
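As a rough sanity check once the Functors v0.5 update is in place, inference through a plain fmap-based transfer can be verified directly; the to32 leaf function below is only an illustrative stand-in for a device conversion:

```julia
using Functors, Test

# Illustrative leaf conversion; a real device would move arrays instead.
to32(x::AbstractArray) = Float32.(x)
to32(x) = x

model = (; weight = rand(2, 2), bias = rand(2))

# @inferred throws if the return type of the call cannot be concretely
# inferred, which is exactly the type-instability concern raised above.
@inferred fmap(to32, model)
```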