DenseConvDims not always type stable #274
Glad you found this, should be easy to fix.
Actually I don't think the constructor can be made type stable, since the type parameter is a runtime input value:

```julia
julia> struct A{N}; x::Int; end

julia> A(i) = A{i}(10)
A

julia> @code_warntype A(1)
Variables
  #self#::Type{A}
  i::Int64

Body::A{_A} where _A
1 ─ %1 = Core.apply_type(Main.A, i)::Type{A{_A}} where _A
│   %2 = (%1)(10)::A{_A} where _A
└──      return %2
```

@joostveenema why do you think this is related to FluxML/Flux.jl#1350?
They are not inferred; the call just throws an error.
You are right, the code example does not make sense. I thought I had simplified the problem. Let's try again. In FluxML/Flux.jl#1350 the problem seems to be that the return type of `channels_in` is not inferred. Compare the following two snippets: in the first one the return type is inferred, while in the second one it is not.

I am not that familiar with the Julia internals, so I might be wrong here.
I believe it's because of the splat.
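A minimal illustration of why splatting defeats inference (a hypothetical sketch, not the actual NNlib code; `Dims4` and `make_dims` are made-up names): when a runtime tuple is splatted into type parameters, the resulting type depends on runtime values, so inference can only produce a `where`-wrapped abstract type.

```julia
# Toy type with four type parameters, mimicking the shape parameters
# that DenseConvDims stores in its type.
struct Dims4{A,B,C,D} end

# The values in `t` are runtime data, so Dims4{t...} cannot be
# computed at inference time; @code_warntype reports something like
# Body::Dims4{_A,_B,_C,_D} where _D where _C where _B where _A
make_dims(t::NTuple{4,Int}) = Dims4{t...}()
```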
Forcing the output type in the definition of `channels_in`:

```julia
channels_in(c::DenseConvDims{N,K,C_in,C_out}) where {N,K,C_in,C_out} = C_in::Int
```

it is still able to infer return type and value in the first example, while in the second example the type is now inferred as well:

```julia
julia> @code_warntype example(x)
Variables
  #self#::Core.Compiler.Const(example, false)
  W::Array{Float32,4}
  cdims::DenseConvDims{2,_A,_B,_C,_D,_E,_F,_G} where _G where _F where _E where _D where _C where _B where _A

Body::Int64
1 ─ %1 = Core.tuple(1, 1, 1, 1)::Core.Compiler.Const((1, 1, 1, 1), false)
│   %2 = Main.size(W)::NTuple{4,Int64}
│        (cdims = Main.DenseConvDims(%1, %2))
│   %4 = NNlib.channels_in::Core.Compiler.Const(NNlib.channels_in, false)
│   %5 = (%4)(cdims)::Int64
└──      return %5
```

Is this enough to fix FluxML/Flux.jl#1350? Maybe we should revisit the `ConvDims` design; it feels like we are abusing parameterization and putting a lot of stress on the inference engine.
Yes! This solves the crashes for me. I put the change I tested with in PR #275.
Closed in #275.
The following code example shows that the type of `DenseConvDims` cannot be properly inferred when its input types are all NTuples. When changing one of the inputs to an array, it can for some reason be correctly inferred. This seems to be the source of FluxML/Flux.jl#1350. Maybe linked to #125?
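The original code example did not survive in this copy of the issue, but based on the `@code_warntype` output in the discussion it was presumably along these lines (a reconstruction; the function names and exact argument shapes are assumptions):

```julia
using NNlib

# Not inferred: both constructor inputs are NTuples, so the type
# parameters of the resulting DenseConvDims depend on runtime values.
function example(W::Array{Float32,4})
    cdims = DenseConvDims((1, 1, 1, 1), size(W))
    return NNlib.channels_in(cdims)
end

# Reportedly inferred: passing one of the inputs as an array instead
# of its size tuple lets inference succeed.
function example_arrays(X::Array{Float32,4}, W::Array{Float32,4})
    cdims = DenseConvDims(X, W)
    return NNlib.channels_in(cdims)
end
```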