
Try reenabling PEtab tutorial #1089

Closed

Conversation

isaacsas
Copy link
Member

It would be nice to get this back into the docs.

@isaacsas
Copy link
Member Author

@sebapersson should PEtab work with the latest Catalyst and MTK now?

@isaacsas
Copy link
Member Author

Ahh, I see PEtab doesn't yet support the latest Optimization version, which we've updated to, so this isn't going to work.

@isaacsas isaacsas closed this Oct 25, 2024
@isaacsas isaacsas deleted the try-reenabling-petab-tutorial branch October 25, 2024 14:02
@sebapersson
Copy link
Contributor

Agree, it would be nice to have this back.

PEtab works with the latest Catalyst and MTK. As for Optimization, I had it capped at v3 because of SciMLSensitivity (which I now see has updated its compat). I will try to bump Optimization, and then it should be possible to reenable the tutorials. I will give a ping here when PEtab is updated.
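For context, a compat bump like the one described above is done in the package's Project.toml. The sketch below is purely illustrative (the version numbers and the exact dependency list are assumptions, not taken from the PEtab repository); it shows the general shape of widening an `[compat]` bound to admit a new major version:

```toml
# Hypothetical sketch of a Project.toml [compat] section.
# Widening Optimization from "3" to "3, 4" lets the resolver pick
# either major series, provided SciMLSensitivity's own compat allows it.
[compat]
Optimization = "3, 4"
SciMLSensitivity = "7"
```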

@isaacsas
Copy link
Member Author

Awesome, let us know, and we will reenable it as soon as a release that works with the bumped Optimization libs is out.

@ChrisRackauckas
Copy link
Member

If you're not using the data iterator or the extra cost-function argument results, it should just be a quick bump. I have a few tutorial bumps to look into (DiffEqFlux, SciMLSensitivity), but they've now made it to the top of my inbox, so if you need help, see those PRs from Vaibhav before I get to the merge.
