Which package to use for ACA method? #2
Comments
Sorry, I don't get your question. Here's an example that uses the odesolver for integration (the forward pass) and trains the parameters with gradients from the backward pass: https://github.com/juntang-zhuang/TorchDiffEqPack/blob/master/test_code/three_body_problem.py For naive solvers, you can use
For ACA, check out
In my package you can use the three methods above interchangeably in most cases, except that they make different memory/accuracy tradeoffs. Auto-diff is a totally independent notion from the ODE solver: the ODE solver just gives the numerical solution to the ODE, while auto-diff back-propagates through an operation. In this special case the operation happens to be "solving an ODE", which is the same notion as back-prop through "matrix multiplication".
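The point above, that "solving an ODE" is just another differentiable operation, can be illustrated with a minimal sketch in plain PyTorch. This is not TorchDiffEqPack's API; it is a hypothetical fixed-step Euler solver (`euler_solve`, `f`, and `theta` are all names introduced here for illustration) through which autograd back-propagates exactly as it would through a matrix multiplication:

```python
import torch

def f(t, y, theta):
    # Simple linear dynamics dy/dt = theta * y, so y(1) = y0 * exp(theta)
    return theta * y

def euler_solve(y0, theta, t0=0.0, t1=1.0, steps=100):
    # Naive fixed-step Euler integration; every step stays in the
    # autograd graph, so memory grows linearly with `steps`.
    h = (t1 - t0) / steps
    y, t = y0, t0
    for _ in range(steps):
        y = y + h * f(t, y, theta)
        t = t + h
    return y

theta = torch.tensor(0.5, requires_grad=True)
y0 = torch.tensor(2.0)
y1 = euler_solve(y0, theta)
y1.backward()
# Analytically, y(1) = 2*exp(0.5) ≈ 3.297 and d y(1)/d theta = 2*exp(0.5),
# so both the solution and theta.grad should be close to 3.297.
print(y1.item(), theta.grad.item())
```

Because the whole integration loop is recorded in the graph, this is the memory-hungry regime that the adjoint and ACA methods are designed to avoid.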
Thanks for the explanation. In this example, the naive solver runs fine. But replacing
You can remove 'regenerate_graph' from the options when using
This example might not be good. PS: the "naive" method is not the true naive method ("true naive" would be to translate a numpy integrator directly into PyTorch); it's much more accurate and memory-efficient than the true naive solver, in which the stepsize search process is also back-propagated through.
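The memory saving behind checkpoint-based adjoints can be sketched generically with `torch.utils.checkpoint`. This is plain PyTorch, not TorchDiffEqPack's implementation of ACA (which additionally keeps the stepsize search out of the back-propagated graph); `euler_checkpointed` and `euler_chunk` are names introduced here for illustration:

```python
import torch
from torch.utils.checkpoint import checkpoint

def f(t, y, theta):
    # Same linear dynamics as before: dy/dt = theta * y
    return theta * y

def euler_chunk(y, theta, t, h, n):
    # Integrate n Euler steps; recomputed during backward when checkpointed
    for _ in range(n):
        y = y + h * f(t, y, theta)
        t = t + h
    return y

def euler_checkpointed(y0, theta, t0=0.0, t1=1.0, steps=100, chunks=10):
    # Only chunk boundaries are kept in memory; the steps inside each
    # chunk are recomputed in the backward pass (compute/memory tradeoff).
    h = (t1 - t0) / steps
    n = steps // chunks
    y, t = y0, t0
    for _ in range(chunks):
        y = checkpoint(euler_chunk, y, theta, t, h, n, use_reentrant=False)
        t = t + n * h
    return y

theta = torch.tensor(0.5, requires_grad=True)
y1 = euler_checkpointed(torch.tensor(2.0), theta)
y1.backward()
# Same result and gradient as the fully-recorded solver (≈ 2*exp(0.5)),
# but with only `chunks` states kept alive instead of `steps`.
print(y1.item(), theta.grad.item())
```

The gradient matches the naive back-prop-through-everything solver; only the memory profile differs, which is the tradeoff the comment above is describing.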
Very interesting work. I want to implement the ACA method; should I use odesolver_mem instead?
Is the odesolver for naive Neural ODEs that uses auto-differentiation?
Would you have an example that uses it for training instead of odesolver? It seems they have different usages.
Thanks.