Default linear solver fails with ForwardDiff for bigger ODE on Ubuntu #389
Can you share that full stack trace?
Can you try #390?
Hotfix patch with #390 coming out. That should fix it. Sad to see PEtab is hitting such a slow path for this, though; it would be good to optimize that.
#390 fixes the problem, thanks! Is MKL a non-native Julia solver for which we cannot use ForwardDiff (so that for these models we must resort to slower linear solvers)?
BLAS cannot use Dual numbers at all, so it'll be a slower fallback. But it's all somewhat unnecessary, since you can just split the Dual numbers and LU-factorize, which is an optimization not done right now but would take like 2 hours if I find the time.
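The "split and LU" idea mentioned above can be sketched as follows. This is a minimal illustration under my own naming (`dual_lu_solve` is hypothetical, not PEtab's or LinearSolve's actual code): for `A*y = b` with `ForwardDiff.Dual` entries, differentiate to get `A * ∂yᵢ = ∂bᵢ - ∂Aᵢ * y`, so one LU factorization of the value part (which can use the fast BLAS/LAPACK path) serves the value solve and every partial.

```julia
using LinearAlgebra
using ForwardDiff
using ForwardDiff: Dual, Partials, value, partials

# Sketch of "split and LU": factorize only value.(A), reuse it for all partials.
function dual_lu_solve(A::AbstractMatrix{Dual{T,V,N}},
                       b::AbstractVector{Dual{T,V,N}}) where {T,V,N}
    Av = value.(A)           # plain value matrix -> fast LAPACK LU applies
    F  = lu(Av)              # factorize once
    yv = F \ value.(b)       # value part of the solution
    # one cheap back-substitution per partial direction, reusing F:
    #   A * ∂y_i = ∂b_i - ∂A_i * y
    dys = ntuple(i -> F \ (partials.(b, i) .- partials.(A, i) * yv), N)
    # reassemble Dual outputs from value and partials
    return [Dual{T}(yv[j], Partials{N,V}(ntuple(i -> dys[i][j], N)))
            for j in eachindex(yv)]
end
```

By contrast, the current slow path is Julia's generic LU fallback on the full Dual matrix (what `A \ b` does), which carries the partials through every step of the factorization.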
So you mean to basically split the Dual numbers? If there is any good example of a similar case (e.g. convenience functions for efficiently extracting the partials) and you do not have the time, I will probably have time to give it a shot next week, as it is needed (currently, for models of this size, PEtab does not have a clear advantage over software such as AMICI).
At 25 states? I'd have to see everything you're doing, but this would be one optimization to try. Though doing it at the nonlinear solver level might be better, unless you're using Rodas5P (which I presume is the case?).
Do you have an MWE to share?
To clarify: for stiff models with 25-35 ODEs and 40-75 parameters in the ODE, runtime-wise we do not have a clear advantage over AMICI for parameter estimation. In Julia we use the QNDF solver for these models and compute the gradient via ForwardDiff. The reason we do not have a clear advantage is that there is not too much difference in gradient computation time against AMICI (and gradients dominate the runtime for parameter estimation). I am a bit unsure which kind of MWE you would like to have?
Currently PEtab.jl fails (crashes) on the gradient test for one of the bigger ODE models (25 states and 39 parameters) when I compute the gradient via ForwardDiff with a stiff ODE solver (e.g. `Rodas5P()`, `QNDF()`). The error message is very long.

Everything worked 12 hours ago, so I think this is related to the release of LinearSolve.jl v2.9, specifically the choice of default solver, because the computations are correct when I choose another linear solver, e.g. `Rodas5P(linsolve=RFLUFactorization())`, instead of the new default MKL solver.

Lastly, the error only occurs for bigger models; for models with <15 ODEs everything is fine and the gradient is computed correctly. I also use an analytical Jacobian provided by ModelingToolkit.
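A generic sketch of the setup this report describes (the ODE here is a made-up linear stand-in, not the actual PEtab model; the function name `loss` and the tolerances are illustrative) would differentiate a scalar loss through a stiff solve, pinning the linear solver as in the workaround above:

```julia
using OrdinaryDiffEq, LinearSolve, ForwardDiff

# Hypothetical stand-in model: 25 states, gradient of a scalar loss
# with respect to the ODE parameters via ForwardDiff through solve.
function loss(p)
    f!(du, u, p, t) = (du .= -p[1] .* u .+ p[2])
    prob = ODEProblem(f!, ones(25), (0.0, 1.0), p)
    # workaround from the thread: pin the linear solver explicitly
    # instead of relying on the LinearSolve.jl v2.9 default (MKL)
    sol = solve(prob, Rodas5P(linsolve = RFLUFactorization());
                abstol = 1e-8, reltol = 1e-8)
    return sum(abs2, sol.u[end])
end

grad = ForwardDiff.gradient(loss, [1.0, 0.5])
```

With the default solver choice, the equivalent call without the `linsolve` keyword is what triggered the reported failure on the larger models.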
I use Ubuntu (v20.04).