
TST: test_streamplot_2D faster #4221

Merged

Conversation

tylerjereddy
Member

@tylerjereddy tylerjereddy commented Aug 1, 2023

  • speed up `test_streamplot_2D()` because it has been reported to regularly take ~20 seconds in CI (see [CI] Tests timing out occasionally #4209)

  • we don't really need to plot the output, which was taking most of the time; instead we can just check the data structures that MDAnalysis returns (this may be a better test by some definitions anyway). We could also spot-check a few values in the arrays if we wanted.

  • locally, that single test seems to run in 0.39 s on this branch vs. 4.7 s on develop
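The plotting-free testing pattern described above can be sketched as follows. This is a minimal illustration with stand-in arrays, not the actual MDAnalysis test; `fake_streamline_arrays` and its values are hypothetical placeholders for the real streamline calculation:

```python
import numpy as np

def fake_streamline_arrays():
    """Hypothetical stand-in for the (u1, v1) displacement grids that a
    2D streamline calculation returns; the real test calls MDAnalysis."""
    rng = np.random.default_rng(42)
    return rng.normal(size=(5, 5)), rng.normal(size=(5, 5))

u1, v1 = fake_streamline_arrays()

# Checking the returned data structures is far cheaper than rendering a
# matplotlib streamplot, and still catches shape and NaN regressions:
assert u1.shape == (5, 5)
assert v1.shape == (5, 5)
assert np.isfinite(u1).all()
assert np.isfinite(v1).all()
```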

Developer's Certificate of Origin


📚 Documentation preview 📚: https://mdanalysis--4221.org.readthedocs.build/en/4221/

```python
with open(plot_outpath, 'rb'):
    pass
assert u1.shape == (5, 5)
assert v1.shape == (5, 5)
```
Member Author


Maybe spot-checking values here would make sense; to be fair, we weren't doing that before either, though.

@github-actions

github-actions bot commented Aug 1, 2023

Linter Bot Results:

Hi @tylerjereddy! Thanks for making this PR. We linted your code and found the following:

Some issues were found with the formatting of your code.

| Code Location | Outcome |
|---------------|---------|
| main package  | ✅ Passed |
| testsuite     | ⚠️ Possible failure |

Please have a look at the darker-main-code and darker-test-code steps here for more details: https://github.com/MDAnalysis/mdanalysis/actions/runs/5755964917/job/15604492736


Please note: The black linter is purely informational, you can safely ignore these outcomes if there are no flake8 failures!

@codecov

codecov bot commented Aug 1, 2023

Codecov Report

Patch coverage is unchanged; project coverage changes by -0.01% ⚠️

Comparison is base (66a2b21) 93.62% compared to head (4a6fbdf) 93.62%.
Report is 4 commits behind head on develop.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #4221      +/-   ##
===========================================
- Coverage    93.62%   93.62%   -0.01%     
===========================================
  Files          193      193              
  Lines        25295    25295              
  Branches      4063     4063              
===========================================
- Hits         23683    23682       -1     
  Misses        1096     1096              
- Partials       516      517       +1     

see 1 file with indirect coverage changes


Member

@IAlibay IAlibay left a comment


Thanks! If you want to add a few value spot checks that'd be great, otherwise it at least doesn't regress the amount of testing we do.

Just the one thing about using `pytest.approx` for single float value checks.

```python
pass
assert u1.shape == (5, 5)
assert v1.shape == (5, 5)
assert_allclose(avg, 0.965194167)
```
Member


Are these single value checks? If so, could we use `pytest.approx` instead (it is much faster, last I checked)?
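For reference, a minimal sketch of the two comparison styles being discussed; the numbers here are illustrative placeholders, not values from the real test:

```python
import numpy as np
import pytest
from numpy.testing import assert_allclose

avg = 0.965194167  # illustrative scalar, e.g. an average displacement

# pytest.approx reads naturally in a bare assert for a single float
# (its default relative tolerance is 1e-6):
assert avg == pytest.approx(0.965194167)

# assert_allclose is the usual choice for element-wise array checks:
grid = np.full((5, 5), avg)
assert_allclose(grid, 0.965194167, rtol=1e-7)
```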

* use `pytest.approx` for single value comparisons

* `u1` and `v1` are now checked more thoroughly for
their actual floating point values
@tylerjereddy
Member Author

I revised to use `pytest.approx` for the single value checks (didn't see a perf change either way), and used `assert_allclose` to check all the values in `u1` and `v1`, since those arrays are fairly small anyway (still no perf change for me locally).

Member

@IAlibay IAlibay left a comment


Thanks!

@IAlibay
Member

IAlibay commented Aug 9, 2023

/azp run

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@IAlibay
Member

IAlibay commented Aug 10, 2023

/azp run

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@IAlibay IAlibay merged commit 1eca655 into MDAnalysis:develop Aug 10, 2023
22 of 23 checks passed
@tylerjereddy tylerjereddy deleted the treddy_test_streamplot_2D_speed branch August 10, 2023 22:33