
Add docs of AutoSparseForwardDiff and matrix coloring #231

Merged
merged 2 commits into SciML:master on Oct 9, 2023

Conversation

ErikQQY
Member

@ErikQQY commented Oct 8, 2023

Continue #213


codecov bot commented Oct 8, 2023

Codecov Report

Merging #231 (af7654c) into master (143a966) will increase coverage by 1.03%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master     #231      +/-   ##
==========================================
+ Coverage   19.84%   20.87%   +1.03%     
==========================================
  Files           8        8              
  Lines         776      776              
==========================================
+ Hits          154      162       +8     
+ Misses        622      614       -8     

see 2 files with indirect coverage changes

Comment on lines +294 to +295
colorvec = ArrayInterface.matrix_colors(jac_sparsity)
ff = NonlinearFunction(brusselator_2d_loop; jac_prototype=float.(jac_sparsity), colorvec)
Member

Aren't the color vectors computed automatically, so this step is unnecessary?

@@ -275,3 +275,28 @@ nothing # hide

For more information on the preconditioner interface, see the
[linear solver documentation](https://docs.sciml.ai/LinearSolve/stable/basics/Preconditioners/).

## Speed up Jacobian computation with sparsity exploitation and matrix coloring
Member

This seems a bit oddly placed, since the parts above already did sparsity and matrix coloring. This is just a less manual route to it.

Member Author

Avik said the part above (https://docs.sciml.ai/NonlinearSolve/dev/tutorials/advanced/#Declaring-a-Sparse-Jacobian-with-Automatic-Sparsity-Detection) didn't actually use sparsity; we still need to use a sparse AD type to get matrix coloring. Did I understand this wrong?
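
To illustrate what I mean, here is a small standalone sketch (not the tutorial's Brusselator; the toy residual, sizes, and names are just placeholders): the sparsity pattern is detected and passed as `jac_prototype`, but the sparse, colored Jacobian path is only taken once a sparse AD type is selected.

```julia
using NonlinearSolve, ADTypes, Symbolics, SparseArrays

# Toy in-place residual with a banded (hence sparse) Jacobian.
function f!(du, u, p)
    du[1] = u[1]^2 - p[1]
    for i in 2:length(u)
        # each residual touches only two unknowns
        du[i] = u[i]^2 - u[i - 1] - p[1]
    end
    return nothing
end

u0 = ones(32)
p = [2.0]

# Detect the sparsity pattern and hand it over as the Jacobian prototype.
jac_sparsity = Symbolics.jacobian_sparsity((du, u) -> f!(du, u, p), similar(u0), u0)
ff = NonlinearFunction(f!; jac_prototype = float.(jac_sparsity))
prob = NonlinearProblem(ff, u0, p)

# The sparse AD type is what triggers the colored, compressed Jacobian computation;
# with a dense AD type the prototype is currently turned into a dense Jacobian.
sol = solve(prob, NewtonRaphson(; autodiff = AutoSparseForwardDiff()))
```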

Member

@ErikQQY is right. Right now, even if you have colorvecs and such but use a non-sparse AD type, we construct a dense Jacobian because of how SparseDiffTools is set up. But this can be modified.

Member

Oh. We should change that. If you pass a sparse matrix into jac_prototype, it should always use sparse differentiation on it. The SparseDiffTools machinery should only override it to force sparsity in the case of jac_prototype = nothing; otherwise it should respect the user's type and color on demand (since coloring is super cheap).
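
For a rough sense of scale, a standalone sketch of the on-demand coloring cost (assuming SparseDiffTools is loaded so that `ArrayInterface.matrix_colors` has a method for general sparse matrices; the tridiagonal pattern is only an example):

```julia
using ArrayInterface, SparseDiffTools, SparseArrays, LinearAlgebra

n = 10_000
# Example sparsity pattern: tridiagonal, stored as a general sparse matrix.
jac_sparsity = sparse(Tridiagonal(ones(n - 1), ones(n), ones(n - 1)))

# Greedy coloring of the columns; a cheap pass over the pattern.
colorvec = ArrayInterface.matrix_colors(jac_sparsity)

# Only a handful of colors (3 for this banded pattern) are needed, so a few
# compressed forward-mode sweeps recover all 10_000 Jacobian columns.
maximum(colorvec)
```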

@ChrisRackauckas merged commit cc04a70 into SciML:master on Oct 9, 2023
6 of 8 checks passed