
Add regression suite for NonEmpty and update current one #92

Merged 17 commits from nobrakal:addRegress into snowleopard:master on Jul 15, 2018

Conversation

nobrakal
Contributor

Hi,

I have updated the script to support the benchmarks of NonEmpty graphs and created two Travis jobs dedicated to the benchmark of the Algebra.Graph and Algebra.Graph.NonEmpty module.

The tricky part is that the benchmark suite needs criterion, which has many dependencies. I had to set a timeout for the action so that it fails before the Travis time limit, which allows these dependencies to be cached (if the Travis time limit is reached, there is no caching).

So if the build fails the first time, it is almost certainly due to the benchmark dependencies: re-run it and, since they have been cached, the build should pass.

Example here: https://travis-ci.com/nobrakal/alga/builds/78616254
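The early-failure trick can be sketched with the standard `timeout` utility (a hypothetical sketch, not the actual script from this PR; the time budget and the `sleep 5` stand-in for the real dependency build are made up for the demo). `timeout` exits with status 124 when the deadline is hit, and failing the job ourselves, instead of being killed at Travis's hard limit, is what lets Travis still save the cache:

```shell
#!/bin/sh
# Hypothetical sketch of the early-failure trick: run a build step under
# a deadline shorter than Travis's 50-minute hard limit. `timeout` exits
# with status 124 when the deadline is hit, and failing the job ourselves
# (instead of being killed by Travis) means the cache is still saved.

# `sleep 5` stands in for the real dependency build (e.g. criterion).
run_with_deadline() {
  timeout "$1" sleep 5
  if [ $? -eq 124 ]; then
    echo "deadline reached: failing early so the cache is saved"
    return 1
  fi
  echo "build finished in time"
}

run_with_deadline 1 || echo "job marked as failed (cache will be kept)"
```

On the next run the cached dependencies make the build fit comfortably inside the deadline, so the same script succeeds.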

@nobrakal
Contributor Author

(as discussed in #90)

@snowleopard
Owner

@nobrakal Thanks! Why did both benchmark suites fail?

Also, it's not really obvious which module is benchmarked where -- could you make it visible in the Travis report by using a (possibly unused) environment variable like NonEmpty=1?

@nobrakal
Contributor Author

It fails because of the time needed to download and build the dependencies of the bench suite. It simply takes too long. Normally it fails only the first time; after that the dependencies are cached and everything is OK. Please re-run the failed build to see.

I have added better variable names, I hope they are actually better ^^

@snowleopard
Owner

It simply takes too long

Travis has a 50 min time limit, but the tasks failed after 45 min. Are you forcing the termination earlier?

@snowleopard
Owner

I have added better variable names, I hope they are effectively better

@nobrakal Yep, that's better, thank you :)

@nobrakal
Contributor Author

Are you forcing the termination earlier?

Yep, otherwise Travis does not cache what was built.

@snowleopard
Owner

Well, at the moment both benchmarking suites are failing consistently. Could you drop the internal time limit? I don't quite understand what it achieves: instead of failing to cache things, it now fails completely (and also fails to cache things because it failed!) -- so I don't see how this makes things better :)

@nobrakal
Contributor Author

Sorry if I wasn't clear:

If the time limit is reached, Travis does not cache anything. But if a job fails, it does cache. So the idea is to "manually" fail when we approach the time limit, to force the cache to be saved. For the current state of this PR, see: https://travis-ci.org/snowleopard/alga/jobs/402850821#L2019 .

If you restart it now, it should build just fine.

@snowleopard
Owner

OK, I've restarted the jobs. Let's see how this goes...

@snowleopard
Owner

@nobrakal Aha, it worked!

I noticed that one of the benchmarks is called creation which is a bit vague -- could you rename it to edges and edges1, which I guess is how graphs are created?

@nobrakal
Contributor Author

Great :)

I noticed that one of the benchmarks is called creation which is a bit vague -- could you rename it to edges and edges1, which I guess is how graphs are created?

creation is not very clear, but neither is edges (that name also refers to the function returning the list of edges of a graph). What about graphFromEdges?

Note that I didn't replace the function names with their *1 counterparts, since it is messy sed work and I don't think it is really necessary (the script in use is already not very beautiful or readable). I can do it if you want.

@snowleopard
Owner

creation is not very clear, but neither is edges (that name also refers to the function returning the list of edges of a graph). What about graphFromEdges?

I don't understand: both edges and edges1 are very precise -- they refer exactly to the functions in the API. But there are no functions named creation or graphFromEdges.

I think the regression suite should simply list the functions that have been benchmarked -- exactly how they are named in the library API.

@nobrakal
Contributor Author

Ah sorry, I thought you were suggesting something for the graph suite itself, not this specific version.

OK, it can be done with some sed magic :)

@snowleopard
Owner

Ah sorry, I thought you were suggesting something for the graph suite itself, not this specific version.

Oops, sorry for being ambiguous :) I meant just this regression suite.

@nobrakal
Contributor Author

No problem ^^

I have updated the script; please try to restart the two corresponding Travis jobs :)

(only creation, vertexList and removeVertex needed renaming)
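The "sed magic" could look something like the following (a hypothetical sketch: the input lines and the exact substitutions are assumptions for the demo, not the PR's actual script). It rewrites a benchmark label to the matching API name and appends the `1` suffix used by the NonEmpty variants:

```shell
#!/bin/sh
# Hypothetical sketch: rename benchmark labels so they match the library
# API (creation -> edges), then add the "1" suffix used by the NonEmpty
# counterparts. The input lines are made up for the demo.
printf 'creation\nvertexList\nremoveVertex\n' \
  | sed -e 's/^creation$/edges/' -e 's/$/1/'
```

Both sed expressions run on every line, so `creation` first becomes `edges` and then picks up the suffix like the others.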

@snowleopard
Owner

@nobrakal Many thanks, looks good!

One more suggestion while we are at it: could you find a way to hide all build/compilation information which is not really useful for performance regression?

We should of course show all warnings etc on usual Travis instances, but for performance regression we'd like to see only performance figures, if that's possible.

@nobrakal
Contributor Author

So I have reworked it.

Now there are separate script sections for the corresponding matrix entries. This disables the "classic" script and replaces it with the regression suite.

It should drop the unwanted output and save time :)

I have also tweaked the script to hide all output.
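Hiding the build chatter while keeping the performance figures can be sketched with plain output redirection (a hypothetical sketch: the function names and the benchmark line are invented stand-ins, not the PR's real script or numbers):

```shell
#!/bin/sh
# Hypothetical sketch: silence the noisy compilation step and show only
# the benchmark figures. Both functions are invented stand-ins.
noisy_build() {                  # stands in for building the bench suite
  echo "compiling Algebra.Graph..."
  echo "warning: unused import"
}
run_bench() {                    # stands in for the criterion run
  echo "edges1: 42ms"            # made-up figure for the demo
}

noisy_build > /dev/null 2>&1     # drop compiler output and warnings
run_bench                        # only performance figures reach the log
```

On the regular Travis instances the redirection would simply be omitted, so warnings stay visible there.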

@snowleopard
Owner

snowleopard commented Jul 14, 2018

Hmm, for some reason Travis results are now missing (only AppVeyor is shown). Could you push another commit to try to trigger a rebuild?

@nobrakal
Contributor Author

Oops, it was my fault: the YAML syntax is a bit tricky when it comes to new lines ^^

@snowleopard
Owner

@nobrakal We now get:

No output has been received in the last 10m0s, this potentially indicates a stalled build or something wrong with the build itself.

Perhaps, you could output something to indicate progress, e.g. just dots on a line ...?

@nobrakal
Contributor Author

Funny that my tests ran without hitting this error!

I have corrected my script in nobrakal/benchAlgaPr@9a0547f. It outputs a line every 5 minutes while installing dependencies.
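A periodic progress line like that is typically a background heartbeat loop. The sketch below is hypothetical (it is not the code from nobrakal/benchAlgaPr@9a0547f): it uses a 1-second interval instead of 300 seconds so it runs quickly, and `sleep 3` stands in for the long, silent dependency install:

```shell
#!/bin/sh
# Hypothetical sketch: emit a heartbeat while a silent step runs, so
# Travis's "no output in 10m0s" watchdog never fires. A real script
# would sleep 300s between lines; 1s keeps this demo fast.
( while true; do echo "still installing dependencies..."; sleep 1; done ) &
hb=$!                     # remember the heartbeat's PID

sleep 3                   # stand-in for the long, silent install step

kill "$hb" 2>/dev/null    # stop the heartbeat once the step finishes
echo "dependencies installed"
```

The heartbeat runs in a subshell so killing a single PID is enough to stop it cleanly.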

@snowleopard snowleopard merged commit f2d8658 into snowleopard:master Jul 15, 2018
@snowleopard
Owner

@nobrakal Many thanks! All looks good now -- merged.

@nobrakal nobrakal deleted the addRegress branch July 17, 2018 14:17