
Default linear solver fails with ForwardDiff for bigger ODE on Ubuntu #389

Closed
sebapersson opened this issue Oct 6, 2023 · 9 comments

@sebapersson

Currently PEtab.jl fails (crashes) on the gradient test for one of the bigger ODE models (25 states and 39 parameters) when I compute the gradient via ForwardDiff with a stiff ODE solver (e.g. Rodas5P(), QNDF()). The error message is very long, but it boils down to:

LoadError: MethodError: no method matching getrf!(::Matrix{ForwardDiff.Dual{ForwardDiff.Tag{PEtab.var"#430#444...
Closest candidates are:
    getrf!(::AbstractMatrix{<:Float64}; ipiv, info, check)
     @ LinearSolve ~/.julia/packages/LinearSolve/R5I9Z/src/mkl.jl:11
    getrf!(::AbstractMatrix{<:Float32}; ipiv, info, check)
     @ LinearSolve ~/.julia/packages/LinearSolve/R5I9Z/src/mkl.jl:31

Everything worked 12 hours ago, so I think this is related to the release of LinearSolve.jl v2.9, specifically the new choice of default solver: the computations are correct when I choose another linear solver, such as Rodas5P(linsolve=RFLUFactorization()), instead of the new default MKL solver.

Lastly, the error only occurs for bigger models; for models with fewer than 15 ODEs everything is fine and the gradient is computed correctly. I also use an analytical Jacobian provided by ModelingToolkit.

I use Ubuntu (v20.04).

@sebapersson sebapersson changed the title Default linear solver fails with ForwardDiff fails for bigger ODE on Ubuntu Default linear solver fails with ForwardDiff for bigger ODE on Ubuntu Oct 6, 2023
@ChrisRackauckas
Member

Can you share that full stack trace?

@ChrisRackauckas
Member

Can you try #390 ?

@ChrisRackauckas
Member

Hotfix patch with #390 coming out. That should fix it. Sad to see PEtab is hitting such a slow path for this, though; it would be good to optimize that.

@sebapersson
Author

#390 fixes the problem thanks!

Is MKL a non-native Julia solver that cannot be used with ForwardDiff, so that for these models we must resort to slower linear solvers?

@ChrisRackauckas
Member

No BLAS implementation can use Dual numbers, so it will hit a slower fallback. But it's all somewhat unnecessary, since you can just split the Duals and do the LU on the values, an optimization that is not done right now but would take about 2 hours if I find the time.
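The point about BLAS and Dual numbers can be illustrated with the stdlib alone (a minimal sketch, using Rational as a stand-in for a non-BLAS element type such as ForwardDiff.Dual): LAPACK's getrf! is only defined for BLAS float matrices, which is exactly the MethodError from the issue, while Julia's generic pure-Julia lu fallback handles any suitable Number eltype.

```julia
using LinearAlgebra

# Stand-in for Matrix{Dual}: Rational is likewise not a BLAS element type.
A = Rational{Int}[4 1; 1 3]

# LAPACK.getrf! only has methods for Float32/Float64 (and complex) matrices,
# so it is not applicable to this matrix — mirroring the reported MethodError.
println(applicable(LinearAlgebra.LAPACK.getrf!, A))   # false

# The generic pure-Julia LU fallback works for any suitable Number eltype
# (slower than BLAS, which is the "slower fallback" mentioned above).
F = lu(A)
println(F.L * F.U == A[F.p, :])                       # true: exact for Rationals
```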

@sebapersson
Author

So you mean to basically split the Matrix{Dual} into the values and partials, perform the LU for the value part, loop over the partials and perform an LU-based solve for each basis, and then put everything back together into an output Matrix{Dual}?

If there is a good example of a similar case (e.g. convenience functions for efficiently extracting partials) and you do not have the time, I will probably have time to give it a shot next week, since it is needed: currently, for models of this size, PEtab does not have a clear advantage over software such as AMICI.
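The split-and-reuse scheme described above could be sketched as follows for a single linear solve A*x = b (dual_solve is a hypothetical helper, not an existing PEtab or LinearSolve function; the Dual is represented explicitly as a value/partial pair rather than via ForwardDiff). Differentiating A*x = b gives A0*xdot = bdot - Adot*x0, so every partial direction reuses the one LU factorization of the value matrix A0 rather than refactorizing per basis:

```julia
using LinearAlgebra

# Hypothetical helper: solve A*x = b where A and b each carry one partial
# derivative (Adot, bdot), reusing a single LU of the value part A0.
# From d/dε [A(ε)*x(ε) = b(ε)]:  A0*xdot = bdot - Adot*x0.
function dual_solve(A0, Adot, b0, bdot)
    F = lu(A0)                       # one factorization of the value matrix
    x0 = F \ b0                      # primal (value) solve
    xdot = F \ (bdot - Adot * x0)    # partial solve, reusing F
    return x0, xdot
end

# Sanity check against finite differences on a small system
A0 = [4.0 1.0; 1.0 3.0]; Adot = [0.1 0.0; 0.0 0.2]
b0 = [1.0, 2.0];         bdot = [0.3, 0.1]
x0, xdot = dual_solve(A0, Adot, b0, bdot)

h = 1e-7
x_fd = ((A0 .+ h .* Adot) \ (b0 .+ h .* bdot) .- x0) ./ h
```

With multiple partials (ForwardDiff chunks), the loop over basis directions would repeat only the cheap triangular solves, not the O(n³) factorization.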

@ChrisRackauckas
Member

At size 25 states? I'd have to see everything you're doing, but this would be one optimization to do. Though doing it at the nonlinear solver level might be better, unless you're using Rodas5P (which I presume is the case?).

@ChrisRackauckas
Member

Do you have an MWE to share?

@sebapersson
Author

To clarify: for stiff models with 25-35 ODEs and 40-75 parameters in the ODE, we do not have a clear runtime advantage over AMICI for parameter estimation. In Julia we use the QNDF solver for these models and compute the gradient via ForwardDiff. The reason we do not have a clear advantage is that there is not much difference in gradient computation time compared with AMICI (and gradient computations dominate the runtime of parameter estimation).

I am a bit unsure which kind of MWE you would like to have?
