
CompatHelper: bump compat for OptimKit to 0.4, (keep existing compat) #122

Merged

Conversation

github-actions[bot] (Contributor)

This pull request changes the compat entry for the OptimKit package from 0.3 to 0.3, 0.4.
This keeps the compat entries for earlier versions.

Note: I have not tested your package with this new compat entry.
It is your responsibility to make sure that your package tests pass before you merge this pull request.
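Concretely, this means the [compat] entry in Project.toml goes from `OptimKit = "0.3"` to:

```toml
[compat]
OptimKit = "0.3, 0.4"
```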

lkdvos (Member) commented Jan 28, 2025

Requires QuantumKitHub/MPSKit.jl#239

codecov bot commented Jan 28, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Files with missing lines | Coverage | Δ
src/PEPSKit.jl | 87.50% <ø> | (ø)

leburgel (Member)

Seems like the failing tests are due to an unlucky random seed, which somehow plays out much worse with the new OptimKit version. I can reproduce the failures locally, and they can be fixed by lowering the gradtol on the optimizer, but the convergence along the way is terrible.

Even for the cases that pass, comparing a current CI run with a previous one shows it's now absolutely full of failing SVD pullbacks and fixed-point gradient computations. Just removing the seed makes things much better behaved (at least locally, for a few runs). So perhaps we could just randomize the seed again?
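For concreteness, the two workarounds look roughly like this (a sketch only: `fg` and `peps₀` stand in for the actual test setup, the numbers are made up, and keyword defaults may differ between OptimKit versions):

```julia
using Random, OptimKit

# Workaround 1: randomize instead of pinning the seed.
Random.seed!()  # reseed the default RNG from system entropy

# Workaround 2: lower gradtol so the optimizer only stops once the gradient
# norm is actually small, instead of terminating at a bad point.
alg = LBFGS(32; gradtol = 1e-4, maxiter = 200, verbosity = 2)
# x, fx, gx, numfg, history = optimize(fg, peps₀, alg)
```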

lkdvos (Member) commented Jan 29, 2025

Does it help if we increase the linesearch maxiter? I would guess that's the only thing that really changed.

leburgel (Member)

> Does it help if we increase the linesearch maxiter? I would guess that's the only thing that really changed.

If anything I would decrease the maxiter, to get out of the seemingly terrible starting place faster; the default maxiter is already pretty large for a PEPS setting. Here it really seems like it starts struggling as soon as it takes a first step and then never really recovers. Also, I guess it doesn't take many changes in OptimKit to shift where we end up when optimizing from the old random seed.
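Something like the following, with made-up numbers (assuming the `linesearch` and `maxiter` keywords as documented in the OptimKit README):

```julia
using OptimKit

# Cap the line-search iterations so a bad first step cannot eat an unbounded
# number of function evaluations; 10 is an arbitrary choice.
ls = HagerZhangLineSearch(; maxiter = 10)
alg = LBFGS(32; gradtol = 1e-4, linesearch = ls)
```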

Instead of changing seeds again, we could also try systematically seeding everything with a simple update result and see if that stabilizes things?
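In pseudocode, the seeding idea would look something like this (`H`, the `simple_update` helper and its signature are hypothetical placeholders, not the actual PEPSKit API):

```julia
# Sketch only: run a cheap simple-update pass first and use its result as the
# starting point for the gradient-based optimization.
peps₀ = InfinitePEPS(2, 2)                       # random initial PEPS (d = D = 2)
peps_su = simple_update(peps₀, H; maxiter = 100) # hypothetical helper, H = Hamiltonian
# ... then optimize starting from peps_su instead of peps₀
```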

leburgel (Member)

Also, I'm getting these warnings locally:

┌ Warning: The function `scale!!` is not implemented for (values of) type `Tuple{PEPSKit.InfinitePEPS{TensorKit.TensorMap{ComplexF64, TensorKit.GradedSpace{TensorKitSectors.FermionParity, Tuple{Int64, Int64}}, 1, 4, Vector{ComplexF64}}}, Float64}`;
│ this fallback will disappear in future versions of VectorInterface.jl
┌ Warning: The function `add!!` is not implemented for (values of) type `Tuple{PEPSKit.InfinitePEPS{TensorKit.TensorMap{ComplexF64, TensorKit.GradedSpace{TensorKitSectors.FermionParity, Tuple{Int64, Int64}}, 1, 4, Vector{ComplexF64}}}, PEPSKit.InfinitePEPS{TensorKit.TensorMap{ComplexF64, TensorKit.GradedSpace{TensorKitSectors.FermionParity, Tuple{Int64, Int64}}, 1, 4, Vector{ComplexF64}}}, Int64, VectorInterface.One}`;
│ this fallback will disappear in future versions of VectorInterface.jl
└ @ VectorInterface ~/.julia/packages/VectorInterface/J6qCR/src/fallbacks.jl:163

Are these from the new VectorInterface v0.5?
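For what it's worth, those fallback warnings should go away once the methods are implemented for `InfinitePEPS` directly; a rough sketch of what that could look like (the `A` field and the `InfinitePEPS` constructor-from-array are assumptions about PEPSKit's internals):

```julia
using VectorInterface
using PEPSKit: InfinitePEPS

# Sketch only: forward scale!! and add!! to the underlying tensors, matching
# the argument types in the warnings above.
function VectorInterface.scale!!(ψ::InfinitePEPS, α::Number)
    return InfinitePEPS(map(t -> scale!!(t, α), ψ.A))
end
function VectorInterface.add!!(ψ::InfinitePEPS, ϕ::InfinitePEPS, α::Number, β::Number)
    return InfinitePEPS(map((t, s) -> add!!(t, s, α, β), ψ.A, ϕ.A))
end
```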

lkdvos (Member) commented Jan 29, 2025

Not necessarily v0.5; it's just that OptimKit's default retractions etc. now use VectorInterface, where previously they did not.
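If the fallbacks become a nuisance before a proper fix lands, `optimize` also accepts explicit vector operations (keyword names as in the OptimKit README for v0.3; a sketch, assuming v0.4 still supports them and that the gradients behave like plain arrays):

```julia
using OptimKit, LinearAlgebra

# Sketch: bypass the VectorInterface-based defaults by supplying the vector
# operations explicitly; fg and x₀ are placeholders for the actual cost
# function and initial PEPS.
x, fx, gx = optimize(fg, x₀, LBFGS(32);
    inner = (x, ξ₁, ξ₂) -> real(dot(ξ₁, ξ₂)),
    scale! = (η, β) -> rmul!(η, β),
    add! = (η, ξ, β) -> axpy!(β, ξ, η),
)
```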

pbrehmer (Collaborator)

I find it really odd that the random seed should influence the behavior of OptimKit. LBFGS and the line search should be completely deterministic anyway, right? Perhaps some parameter defaults or default behavior have changed?

Also, I'm not sure it's really necessary to bump the OptimKit version. The plan is still to use Manopt anyway, right? In that case, I'd be happy to get #105 merged as soon as possible :)

pbrehmer (Collaborator)

Perhaps this line makes the difference? It was changed from η = scale!(Pg, -1/normPg), meaning that the initial tangent vector is now scaled differently, which might make the optimization less stable somehow.
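If the guess is right that the normalization was dropped (an assumption; the linked diff would confirm), the difference would be:

```julia
# Old: initial search direction normalized to unit length
η = scale!(Pg, -1/normPg)
# New (assumed): without the 1/normPg factor, the first step is taken at the
# raw gradient scale, which can overshoot badly when the initial gradient is large
η = scale!(Pg, -1)
```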

leburgel (Member) commented Jan 31, 2025

> Also, I'm not sure it's really necessary to bump the OptimKit version. The plan is still to use Manopt anyway, right? In that case, I'd be happy to get #105 merged as soon as possible :)

I agree it might not really be necessary given the planned move away from OptimKit. Still, I personally would have really liked to be able to use v0.4 with PEPSKit, if only to actually enforce a maxiter on the line search, which would prevent a lot of problems. Just for that, I would like to have something compatible in the commit history.

> Perhaps this line makes the difference?

I also agree that just changing the random seed to make the tests pass is very shady, and it really shouldn't be necessary since there are no big changes in OptimKit. Nevertheless, we have always had issues with random initial guesses (which is why the seeds were there in the first place), and it is not that far-fetched that any small change could perturb things just enough to cause trouble in the tests. From just running a few times, I also wouldn't say that the new version of OptimKit makes things more unstable in general, so I don't think the change is necessarily harmful; it's just enough of a difference that we're seeing it here.

Given all this, would we be opposed to just re-stabilizing the tests here (by changing seeds if necessary), getting this merged and into the history, and then moving on with merging #105? How does that sound @lkdvos @pbrehmer?

pbrehmer (Collaborator) commented Jan 31, 2025

> Given all this, would we be opposed to just re-stabilizing the tests here (by changing seeds if necessary), getting this merged and into the history, and then moving on with merging #105?

I would definitely be fine with that. It would be good to stabilize the tests further anyway, but I'm not sure how much time I can spend on that right now. I'd be happy to help though :)

lkdvos (Member) commented Jan 31, 2025

Agreed on stabilizing tests, merging this and tagging a release.

leburgel (Member) commented Feb 3, 2025

The simple-update-into-AD test still has some svdsolve cotangent warnings and non-converging gradients, but this seems to have always been the case. Probably nothing to worry about? Good to go for me otherwise!

pbrehmer (Collaborator) commented Feb 3, 2025

> The simple-update-into-AD test still has some svdsolve cotangent warnings and non-converging gradients, but this seems to have always been the case. Probably nothing to worry about? Good to go for me otherwise!

Indeed, that was always there. I think we still don't really understand why these cotangent warnings come up, but I've found it hard to further stabilize that particular test. In my opinion we can leave it for now and investigate and improve it in another PR.

leburgel merged commit 4d5a516 into master on Feb 3, 2025
27 checks passed
leburgel deleted the compathelper/new_version/2025-01-28-01-10-04-750-01930399612 branch on February 3, 2025 at 11:15