
Run JuliaInterface tests as part of Pkg.test? #1005

Closed · lgoettgens opened this issue Jun 18, 2024 · 8 comments · Fixed by #1014

@lgoettgens (Member)

or at least in PkgEval runs?
In JuliaLang/julia#54678, a PkgEval job was run, and PRs were submitted to all broken packages. However, since this only ran the GAP.jl tests and not those of JuliaInterface, #1004 was not found there.
I think it would make sense to run the JuliaInterface tests as part of PkgEval jobs, to make the Julia devs aware of possible future breakage they introduce.
What do you think?

@fingolfin (Member)

I have no idea how to run tests "at least in PkgEval runs".

If running the JuliaInterface tests as part of Pkg.test seems useful, sure, why not (PR welcome ;-) ). But we will also want to keep the current tests for them, because they (a) test the generated gap.sh wrappers and (b) provide code coverage information.

@lgoettgens (Member, Author)

> I have no idea how to run tests "at least in PkgEval runs".

IIRC they set some environment variable. I'll look into this later.

@benlorenz (Member)

We already use haskey(ENV, "JULIA_PKGEVAL") in some places.
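
A minimal illustration of that pattern (just a sketch, not the actual code):

```julia
# enable extra behavior only when running under PkgEval,
# which sets the JULIA_PKGEVAL environment variable
if haskey(ENV, "JULIA_PKGEVAL")
    # e.g. run additional tests here
end
```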

@ThomasBreuer (Member)

My interpretation of the proposal is as follows:

  1. We extend the file GAP.jl/test/runtests.jl by a @test line that executes the contents of GAP.jl/pkg/JuliaInterface/tst/testall.g, except that GAP shall not quit at the end, and we need a return value.
  2. This additional @test line shall be executed only under the condition haskey(ENV, "JULIA_PKGEVAL").

If this is correct, then GAP.jl/test/runtests.jl should be modified such that it can be used in both situations, i.e., when it is called from GAP or from Julia.
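
A minimal sketch of such a guarded @test line; run_juliainterface_tests is a hypothetical placeholder for whatever mechanism ends up running testall.g without quitting GAP:

```julia
using GAP, Test

if haskey(ENV, "JULIA_PKGEVAL")
    # hypothetical helper: execute the contents of
    # pkg/JuliaInterface/tst/testall.g without quitting GAP
    # and return true on success
    @test run_juliainterface_tests()
end
```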

@fingolfin (Member)

I would prefer not to check for JULIA_PKGEVAL and to just always run those @test lines: they add very little extra time to the tests, and doing it this way is much easier to verify.

To ensure GAP does not quit at the end of tst/testall.g, I see two options:

  1. Modify the two testall.g files to look for some global flag (e.g. check whether DONT_QUIT_AFTER_TEST is bound and set to true) that changes the exitGAP flag value (and also skips the final FORCE_QUIT_GAP). Then we still need the return value of TestDirectory, which could be assigned to another global variable.
  2. Modify the two testall.g files by moving most of their content into a separate file, say tst/setup.g, then change testall.g to read setup.g and afterwards call TestDirectory/FORCE_QUIT_GAP. Then on the Julia side we can read setup.g and call GAP.Globals.TestDirectory with suitable arguments (see the sketch below).

Variant 2 doesn't require messing with global variables.
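
A minimal sketch of the Julia side of variant 2, assuming the split into tst/setup.g has been done (the path computation and the option record are illustrative assumptions):

```julia
using GAP, Test

# locate the JuliaInterface tests inside the GAP.jl tree
# (illustrative; the real path computation may differ)
tstdir = joinpath(dirname(dirname(pathof(GAP))), "pkg", "JuliaInterface", "tst")

# read the shared setup code that testall.g would also read
GAP.Globals.Read(GAP.GapObj(joinpath(tstdir, "setup.g")))

# run the tests without quitting GAP; TestDirectory returns true on success
@test GAP.Globals.TestDirectory(GAP.GapObj(tstdir),
                                GAP.GapObj(Dict{Symbol,Any}(:exitGAP => false)))
```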

Ah, and then there is a third variant: let the test do what the CI tests do, and launch a new GAP (resp. GAP-in-Julia) that runs the package tests; then check its exit code to determine the result.
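
And a rough sketch of variant 3, spawning a fresh Julia process whose exit code reflects the test result (the driver code is an assumption, not what the CI actually runs):

```julia
using GAP, Test

tstdir = joinpath(dirname(dirname(pathof(GAP))), "pkg", "JuliaInterface", "tst")

# hypothetical driver: run the tests in a child process that exits
# with code 0 on success and 1 on failure
code = """
    using GAP
    ok = GAP.Globals.TestDirectory(GAP.GapObj("$tstdir"),
             GAP.GapObj(Dict{Symbol,Any}(:exitGAP => false)))
    exit(ok ? 0 : 1)
    """

@test success(`$(Base.julia_cmd()) --project=$(Base.active_project()) -e $code`)
```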

Either is fine by me as long as someone else implements it ;-)

@ThomasBreuer (Member)

I had thought about something like variant 2. And running these tests unconditionally is fine.

@lgoettgens (Member, Author)

I have something like 1 and 3 in some local branches, but didn't manage to get them working yesterday. Let me just push them here later today so that you guys can have a look at the code.

@lgoettgens (Member, Author)

> I have something like 1 and 3 in some local branches, but didn't manage to get them working yesterday. Let me just push them here later today so that you guys can have a look at the code.

I now created PRs #1014 for alternative 3 and #1015 for alternative 2.
