
Use rmul! instead of broadcast for kscal! #926

Merged: 1 commit into JuliaSmoothOptimizers:main on Nov 4, 2024

Conversation

@amontoison (Member) commented on Nov 4, 2024

rmul! will dispatch to BLAS routines on GPUs.

It will also be faster for CPU types; the code is

function rmul!(X::AbstractArray, s::Number)
    @simd for I in eachindex(X)
        @inbounds X[I] *= s
    end
    X
end

in the source code of Julia (julia/stdlib/v1.11/LinearAlgebra/src/generic.jl).
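
For illustration, here is a minimal sketch of the kind of change described above, not the actual Krylov.jl kscal! code; the function names kscal_broadcast! and kscal_rmul! are hypothetical:

using LinearAlgebra

# Scale a vector in place with broadcasting: a generic elementwise kernel.
function kscal_broadcast!(s::Number, x::AbstractVector)
    x .*= s
    return x
end

# Scale a vector in place with rmul!: dispatches to optimized methods,
# e.g. BLAS-backed scaling for supported array and element types.
function kscal_rmul!(s::Number, x::AbstractVector)
    rmul!(x, s)
    return x
end

x = rand(Float64, 1_000)
kscal_rmul!(2.0, x)   # same result as x .*= 2.0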

@amontoison merged commit 8cdcf6c into JuliaSmoothOptimizers:main on Nov 4, 2024
29 of 30 checks passed
@amontoison deleted the rmul branch on November 4, 2024 at 18:16
github-actions bot (Contributor) commented on Nov 4, 2024

Breakage test results (Package name | latest | stable; badge results not shown):
CaNNOLeS.jl
DCISolver.jl
FletcherPenaltySolver.jl
JSOSolvers.jl
LLSModels.jl
LinearSolve.jl
Percival.jl
RipQP.jl
