Support the MPEC formulation of the problem #10
Might be sensible to migrate some of the code to an autodiff package like https://github.com/google/jax? It seems to have most of NumPy's functionality, but it still looks like an experimental package.
I've been eyeing autograd for a while now. If it works out, automatic differentiation should make the code cleaner and easier to extend. Do you know how JAX would differ for this use case? It looks like XLA imposes some constraints but might also speed things up. A good starting point would be replacing some of the simpler derivatives (e.g., …).

This is relatively low on my to-do list. I think practitioners will benefit much more from other features than from MPEC. But I'm happy to talk more if you want to take a shot at replacing some functions!
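To make the "replace a hand-coded derivative" idea concrete, here is a minimal sketch of what the swap could look like. The objective below is a toy stand-in, not anything from this library, and the names are hypothetical:

```python
import jax
import jax.numpy as jnp

def toy_objective(theta, x):
    # Hypothetical stand-in for a hand-differentiated function; JAX traces
    # the NumPy-like operations and builds the derivative automatically.
    utilities = x @ theta
    shares = jnp.exp(utilities) / (1.0 + jnp.exp(utilities))
    return jnp.sum(shares)

# jax.grad differentiates with respect to the first argument (theta),
# replacing a manually coded analytic derivative.
gradient = jax.grad(toy_objective)
print(gradient(jnp.array([0.5, -1.0]), jnp.ones((10, 2))))
```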
I've not used JAX/autograd (I think JAX is a juiced-up version of autograd, which seems to be a bit of a sketchy repo), but I've used things like PyTorch, which provides fairly painless autograd functionality. My prior is that autograd probably works on a restricted set of NumPy functions: most functions probably have gradients implemented, while things like index assignment or other "discrete"-ish operations might be a pain. Given that I'm going to see you in a week, happy to talk in person as well; very interested in contributing to this library!
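For what it's worth, index assignment is exactly the kind of operation where JAX diverges from NumPy: its arrays are immutable, so in-place updates are written as functional `.at[...]` updates instead. A quick illustration:

```python
import jax.numpy as jnp

x = jnp.zeros(5)
# x[2] = 1.0  # would raise an error: JAX arrays are immutable
y = x.at[2].set(1.0)  # functional update that returns a new array
print(y)  # [0. 0. 1. 0. 0.]
```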
Gotcha, sounds good -- talk soon
This depends on the implementation of Hessian computation from #8.
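For reference, if the autodiff route pans out, the Hessian itself could come almost for free. A hedged sketch, where `toy_gmm_objective` is a made-up stand-in rather than anything from #8:

```python
import jax
import jax.numpy as jnp

def toy_gmm_objective(theta):
    # Made-up moment conditions standing in for a real GMM objective.
    g = jnp.array([theta[0] ** 2 + theta[1], theta[0] * theta[1]])
    return g @ g  # scalar quadratic form

# jax.hessian composes forward- and reverse-mode autodiff to produce the
# exact second-derivative matrix, with no finite differencing.
hessian = jax.hessian(toy_gmm_objective)
print(hessian(jnp.array([1.0, 2.0])))
```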