
Broadcasting issue with element-wise multiplication (Hadamard product) #410

Closed
JinraeKim opened this issue Dec 23, 2020 · 2 comments
JinraeKim commented Dec 23, 2020

Issue

For now, Convex.jl does not support element-wise multiplication via broadcasting (that is, .*).
In many applications, it would be desirable for Convex.jl to be compatible with broadcasting and iteration methods.

The limitation can be worked around with dot(*)(x, y) (see the operations table), but this is a bit annoying when integrating Convex.jl with applications written for generic array types.
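To make the workaround concrete, here is a sketch (untested, following the operations table referenced above) that replaces the broadcast y .* (...) with dot(*) in the example below:

```julia
using Convex

Wy  = rand(128, 3)
Wyu = rand(3, 5)
by  = rand(3)
y   = Variable(3)

# Use Convex.jl's elementwise multiplication dot(*) instead of broadcasting,
# as suggested by the operations table:
inner  = Wyu * zeros(5) + by      # plain Float64 vector, no Convex atoms yet
result = Wy * dot(*)(y, inner)    # builds a Convex expression without a MethodError
```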

Example code

Here's the comparison:

  1. With y = [6, 7, 8] (that is, a plain array):
using Debugger  # provides @bp and @enter

function test()
    Wy = rand(128, 3)
    Wyu = rand(3, 5)
    by = rand(3)
    # y = Variable(3)
    y = [6, 7, 8]
    @bp
    result = Wy * (y .* (Wyu*zeros(5) .+ by))
end

@enter test()

Result:

1|julia> result
128-element Array{Float64,1}:
...

  2. With y = Variable(3):
using Convex, Debugger  # Variable comes from Convex.jl

function test()
    Wy = rand(128, 3)
    Wyu = rand(3, 5)
    by = rand(3)
    y = Variable(3)
    # y = [6, 7, 8]
    @bp
    result = Wy * (y .* (Wyu*zeros(5) .+ by))
end

@enter test()

Result:

ERROR: MethodError: no method matching zero(::Convex.AdditionAtom)
Closest candidates are:
  zero(::Type{Pkg.Resolve.VersionWeight}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Resolve/versionweights.jl:15
  zero(::Type{Pkg.Resolve.FieldValue}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Resolve/fieldvalues.jl:38
  zero(::Type{ModelingToolkit.TermCombination}) at /home/jinrae/.julia/packages/ModelingToolkit/1qEYb/src/linearity.jl:67
  ...
Stacktrace:
 [1] generic_matvecmul!(::Array{Convex.AdditionAtom,1}, ::Char, ::Array{Float64,2}, ::Array{Convex.MultiplyAtom,1}, ::LinearAlgebra.MulAddMul{true,true,Bool,Bool}) at /home/jinrae/julia-1.5.3/share/julia/stdlib/v1.5/LinearAlgebra/src/matmul.jl:685
 [2] mul!(::Array{Convex.AdditionAtom,1}, ::Array{Float64,2}, ::Array{Convex.MultiplyAtom,1}, ::Bool, ::Bool) at /home/jinrae/julia-1.5.3/share/julia/stdlib/v1.5/LinearAlgebra/src/matmul.jl:81
 [3] mul!(::Array{Convex.AdditionAtom,1}, ::Array{Float64,2}, ::Array{Convex.MultiplyAtom,1}) at /home/jinrae/julia-1.5.3/share/julia/stdlib/v1.5/LinearAlgebra/src/matmul.jl:208
 [4] *(::Array{Float64,2}, ::Array{Convex.MultiplyAtom,1}) at /home/jinrae/julia-1.5.3/share/julia/stdlib/v1.5/LinearAlgebra/src/matmul.jl:51
 [5] test() at /home/jinrae/.julia/dev/GliderPathPlanning/test/icnn.jl:105
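For context, frame [1] shows the failure originates in LinearAlgebra's generic matrix-vector fallback, which requires a zero method for the output element type. A minimal sketch (with a hypothetical Atom type standing in for Convex.jl's atoms) reproduces the same class of error:

```julia
# Hypothetical stand-in for Convex.AdditionAtom: it supports + and *,
# but defines no zero() method:
struct Atom end
Base.:+(::Atom, ::Atom) = Atom()
Base.:*(::Float64, ::Atom) = Atom()

A = rand(2, 2)
v = [Atom(), Atom()]
A * v  # dispatches to generic_matvecmul!, which needs zero(::Atom) and throws
```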

Note

  • You can find the background of this issue in a post on the JuliaLang forum.
ericphanson (Collaborator) commented:
Thanks very much for the issue. I had forgotten, but a similar one came up before: #355. I think we currently support a few broadcasted operations, but not the full machinery of broadcasting. Part of the issue is that it can be difficult to implement performantly with Convex.jl's current design.

odow (Member) commented Feb 23, 2022

Closing in favor of #479

@odow odow closed this as completed Feb 23, 2022