
Integrate SparseMatricesCOO #389

Open
dpo opened this issue Dec 17, 2021 · 3 comments

dpo (Member) commented Dec 17, 2021

No description provided.

geoffroyleconte (Member) commented:

Should I modify

function jprod!(
  nlp::AbstractNLPModel,
  rows::AbstractVector{<:Integer},
  cols::AbstractVector{<:Integer},
  vals::AbstractVector,
  v::AbstractVector,
  Jv::AbstractVector,
)

to something like

function jprod!(
  nlp::AbstractNLPModel,
  M::SparseMatrixCOO{T, <: Integer},
  v::AbstractVector,
  Jv::AbstractVector,
) where {T}

?
This may break a lot of code, but otherwise I would have to use the SparseMatrixCOO constructor at each product.
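
For illustration only, here is a minimal sketch of what the proposed method could compute, not the actual NLPModels implementation: Jv = J * v accumulated directly from the stored triplets, so one SparseMatrixCOO allocated up front can be reused across products. The field names M.rows, M.cols, M.vals are assumed from the SparseMatrixCOO struct.

using NLPModels, SparseMatricesCOO

# Sketch only: accumulate Jv = J * v from the COO triplets stored in M.
# Assumes rows/cols/vals are the coordinate fields of SparseMatrixCOO.
function NLPModels.jprod!(
  nlp::AbstractNLPModel,
  M::SparseMatrixCOO{T, <:Integer},
  v::AbstractVector,
  Jv::AbstractVector,
) where {T}
  fill!(Jv, zero(eltype(Jv)))
  for k in eachindex(M.vals)
    Jv[M.rows[k]] += M.vals[k] * v[M.cols[k]]
  end
  return Jv
end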

dpo (Member, Author) commented Dec 21, 2021

We should think about a complete integration of SparseMatricesCOO. NLPModels could return sparse Jacobians and Hessians in COO format. We would then need methods to update the elements of those matrices. After that, I'm not sure whether jprod!(nlp, rows, cols, vals, v, Jv), etc., still make much sense.
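
A rough sketch of that workflow, under the assumption that the COO Jacobian is built once from the existing jac_structure! / jac_coord! API and that SparseMatricesCOO.jl provides matrix-vector products through mul!; the helper names below are hypothetical:

using NLPModels, SparseMatricesCOO, LinearAlgebra

# Hypothetical helper: allocate the COO Jacobian once (structure + initial values).
function jac_coo(nlp::AbstractNLPModel, x::AbstractVector)
  rows = Vector{Int}(undef, nlp.meta.nnzj)
  cols = Vector{Int}(undef, nlp.meta.nnzj)
  jac_structure!(nlp, rows, cols)
  vals = Vector{eltype(x)}(undef, nlp.meta.nnzj)
  jac_coord!(nlp, x, vals)
  return SparseMatrixCOO(nlp.meta.ncon, nlp.meta.nvar, rows, cols, vals)
end

# Hypothetical helper: at a new iterate only the numerical values change,
# so refresh J.vals in place and keep the structure.
function update_jac_coo!(J::SparseMatrixCOO, nlp::AbstractNLPModel, x::AbstractVector)
  jac_coord!(nlp, x, J.vals)
  return J
end

# Products would then go through the matrix itself, e.g. mul!(Jv, J, v),
# rather than jprod!(nlp, rows, cols, vals, v, Jv).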

tmigot (Member) commented Aug 5, 2022

Maybe a first step in this direction would be to replace the following two functions with SparseMatrixCOO functions:
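
The two functions are not quoted in the thread, so the snippet below only illustrates the general pattern being proposed: a hand-rolled triplet product on one side, and the same product obtained by wrapping the triplets once in a SparseMatrixCOO on the other (matrix-vector multiplication for SparseMatrixCOO is assumed to be available through LinearAlgebra).

using LinearAlgebra, SparseMatricesCOO

# Toy 2x2 matrix [2 0; 3 4] stored as COO triplets.
rows, cols, vals = [1, 2, 2], [1, 1, 2], [2.0, 3.0, 4.0]
v = [1.0, 1.0]

# Hand-rolled triplet product, the style the triplet-based API relies on:
Av = zeros(2)
for k in eachindex(vals)
  Av[rows[k]] += vals[k] * v[cols[k]]
end

# Same product after wrapping the triplets once:
A = SparseMatrixCOO(2, 2, rows, cols, vals)
Av ≈ A * v   # true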
