Add docs of AutoSparseForwardDiff and matrix coloring #231
Conversation
Signed-off-by: ErikQQY <[email protected]>
Signed-off-by: ErikQQY <[email protected]>
Codecov Report

```
@@            Coverage Diff             @@
##           master     #231      +/-  ##
==========================================
+ Coverage   19.84%   20.87%   +1.03%
==========================================
  Files           8        8
  Lines         776      776
==========================================
+ Hits          154      162       +8
+ Misses        622      614       -8
```

See 2 files with indirect coverage changes.
```julia
colorvec = ArrayInterface.matrix_colors(jac_sparsity)
ff = NonlinearFunction(brusselator_2d_loop; jac_prototype = float.(jac_sparsity), colorvec)
```
Aren't the color vectors computed automatically, so this step is unnecessary?
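For context, the manual route under discussion looks roughly like the following sketch. It assumes the tutorial's `brusselator_2d_loop`, `du0`, `u0`, and `p` are already defined; this is a reconstruction from the diff snippet, not verbatim from the docs:

```julia
using Symbolics, ArrayInterface, NonlinearSolve

# Detect the Jacobian sparsity pattern symbolically (returns a sparse
# Boolean matrix marking which entries can be structurally nonzero).
jac_sparsity = Symbolics.jacobian_sparsity(
    (du, u) -> brusselator_2d_loop(du, u, p), du0, u0)

# Color the columns of the pattern: columns that share no row can be
# probed together in one dual-number sweep, cutting the number of
# function evaluations needed for the Jacobian.
colorvec = ArrayInterface.matrix_colors(jac_sparsity)

# Hand both the sparse prototype and the coloring to the solver.
ff = NonlinearFunction(brusselator_2d_loop;
                       jac_prototype = float.(jac_sparsity), colorvec)
prob = NonlinearProblem(ff, u0, p)
```

The reviewer's question is whether the `matrix_colors` step is redundant, since the solver could derive the coloring from `jac_prototype` on its own.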
```
@@ -275,3 +275,28 @@ nothing # hide

For more information on the preconditioner interface, see the
[linear solver documentation](https://docs.sciml.ai/LinearSolve/stable/basics/Preconditioners/).

## Speed up Jacobian computation with sparsity exploitation and matrix coloring
```
This seems a bit oddly placed, since the parts above already did sparsity and matrix coloring. This is just a less manual route to it.
Avik said the part above (https://docs.sciml.ai/NonlinearSolve/dev/tutorials/advanced/#Declaring-a-Sparse-Jacobian-with-Automatic-Sparsity-Detection) didn't actually use sparsity; we still need to use a sparse AD type for matrix coloring. Did I understand this wrong?
@ErikQQY is right. Right now, even if you have colorvecs and a sparse `jac_prototype` but use a non-sparse AD type, we construct a dense Jacobian because of how SparseDiffTools is set up. But this can be modified.
Oh. We should change that. If you pass a sparse matrix into `jac_prototype`, it should always use sparse diff on it. The SparseDiffTools stuff should only override it to force sparsity in the case of `jac_prototype=nothing`; otherwise it should respect the user's type and color on demand (since coloring is super cheap).
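The less-manual route this PR documents can be sketched as follows. This is a hedged sketch based on the thread: it assumes `AutoSparseForwardDiff` (named in the PR title) is accepted as the solver's `autodiff` choice, and that `jac_sparsity`, `u0`, and `p` come from the tutorial setup above:

```julia
using NonlinearSolve

# Pass only the sparse prototype; no explicit colorvec. Per the discussion,
# a sparse AD type is (currently) required for the coloring to actually be
# exploited, so it is selected on the solver.
ff = NonlinearFunction(brusselator_2d_loop;
                       jac_prototype = float.(jac_sparsity))
prob = NonlinearProblem(ff, u0, p)

# The sparse AD type triggers colored, compressed Jacobian construction
# inside SparseDiffTools instead of a dense ForwardDiff Jacobian.
sol = solve(prob, NewtonRaphson(autodiff = AutoSparseForwardDiff()))
```

ChrisRackauckas's point above is that this coupling could be loosened: a sparse `jac_prototype` alone should imply sparse differentiation, with on-demand coloring, regardless of the AD type.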
Continue #213