
Don't add a differentiation algorithm to the PRIMA example #720

Closed
wants to merge 1 commit

Conversation

ChrisRackauckas (Member)

Fixes #719

Vaibhavdixit02 (Member) commented Mar 21, 2024

No, that's not the issue. This is part of the OptimizationBase merge; I am looking at it. It will be needed by other backends as well.

ChrisRackauckas (Member, Author)

But why is the PRIMA example using autodiff in the first place? I agree that there are two issues, but this doc example shouldn't be differentiating anything.

Vaibhavdixit02 (Member)

Because in the constrained case PRIMA expects a different interface for linear and nonlinear constraints:

Ω = { x ∈ ℝⁿ | xl ≤ x ≤ xu, Aₑ⋅x = bₑ, Aᵢ⋅x ≤ bᵢ, cₑ(x) = 0, and cᵢ(x) ≤ 0 }

I use autodiff to create this representation from the functional constraint interface.
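
For illustration only, here is a minimal sketch of how the affine pieces (A, b) can be recovered from a constraint given in functional form via ForwardDiff. The constraint function `c` below is a made-up example, not code from OptimizationBase:

```julia
using ForwardDiff

# Hypothetical affine constraints written in functional form, c(x) = A*x - b
c(x) = [2x[1] + 3x[2] - 6.0,
        x[1] - x[2] - 1.0]

x0 = zeros(2)                      # any point works, since the map is affine
A  = ForwardDiff.jacobian(c, x0)   # the constant Jacobian is exactly A
b  = -c(x0)                        # c(0) = -b, so b = -c(0)
```

Genuinely nonlinear constraints stay in functional form; only the linear ones get translated into the matrix/vector data PRIMA expects.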

Vaibhavdixit02 (Member)

Also, in general, we would never error if an AD backend is passed even for a derivative-free optimizer. We do throw an error when it's the other way around, though, and that should get better with #715 and the corresponding PR in SciMLBase. That isn't entirely correct yet, either.
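
A hedged sketch of that behavior with the usual Optimization.jl interface; the Rosenbrock objective and the NEWUOA solver choice here are assumptions for illustration, not code from this PR:

```julia
using Optimization, OptimizationPRIMA

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p  = [1.0, 100.0]

# Passing an AD backend alongside a derivative-free PRIMA solver is accepted
# (the derivatives are simply unused); a gradient-based solver without an AD
# backend is the case that errors.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, p)
sol  = solve(prob, OptimizationPRIMA.NEWUOA())
```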

ChrisRackauckas (Member, Author)

oh it requires knowing the linear operators?

Vaibhavdixit02 (Member)

Yes

Vaibhavdixit02 deleted the ChrisRackauckas-patch-1 branch on April 3, 2024.
Successfully merging this pull request may close these issues:

PRIMA lib errors with AutoForwardDiff (#719)