
modernize testing code #3743

Open
orbeckst opened this issue Jun 30, 2022 · 16 comments

Comments

@orbeckst
Member

orbeckst commented Jun 30, 2022

Is your feature request related to a problem?

The testing code uses pytest and relies on numpy.testing. It contains some deprecated testing constructs, such as assert_almost_equal() and assert_array_almost_equal(), and some sub-optimal tests (such as testing single floats with an array assertion).

Describe the solution you'd like

Update the testing code, following current best practices, such as

  1. use numpy.testing.assert_allclose() instead of assert_almost_equal() or assert_array_almost_equal() for more consistent floating point comparisons
  2. replace array comparisons for floating point scalars (a single floating point number) with pytest.approx() (see Nuclinfo Major Pair and Minor Pair overhaul #3735 (comment))
  3. ...

Please add additional best practices here or in the comments.
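A minimal sketch of points 1 and 2 (the test names and values below are made up for illustration, not taken from the actual test suite):

```python
import numpy as np
import pytest
from numpy.testing import assert_allclose

def test_array_result():
    # point 1: assert_allclose for floating point array comparisons
    result = np.array([0.1, 0.2, 0.3])
    assert_allclose(result, [0.1, 0.2, 0.3], rtol=1e-7, atol=0)

def test_scalar_result():
    # point 2: pytest.approx for a single float instead of an array assertion
    rmsd = 0.5000000001
    assert rmsd == pytest.approx(0.5)
```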

Describe alternatives you've considered

We don't have to do anything right now because everything is working.

How to work on this issue

You can open issues for individual points above; you don't have to work on all of them. If you create a new issue, mention this issue number in the issue text so that it gets linked.

Find occurrences of the constructs that you want to change in the code, e.g.,

cd testsuite/
git grep 'assert_almost_equal'

Figure out how to change the code to get an equivalent result by learning about the different arguments that the old and the new function take and how they affect the assertion. Then pick one example, implement your solution, and check that it works. Improve your example until it does. Only then start converting more of the test files, and get feedback from other developers.
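As a concrete example of "equivalent result": the NumPy docs state that assert_almost_equal(actual, desired, decimal=N) checks abs(desired - actual) < 1.5 * 10**(-N), so a faithful translation disables the relative check and carries over the absolute threshold (the values here are illustrative):

```python
import numpy as np
from numpy.testing import assert_allclose, assert_almost_equal

a = np.array([1.0000004, 2.0])
b = np.array([1.0, 2.0])

# old construct: passes when abs(b - a) < 1.5 * 10**(-6)
assert_almost_equal(a, b, decimal=6)

# equivalent new construct: rtol=0 disables the relative check,
# atol reuses the old absolute threshold
assert_allclose(a, b, rtol=0, atol=1.5e-6)
```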

@utkarsh147-del

I want to work on this issue.

@orbeckst
Member Author

orbeckst commented Jul 9, 2022

We always welcome contributions. Just submit a PR that references this issue. We will then discuss there. When you start the work, please also introduce yourself on the developer mailing list.

Btw, we don’t “reserve” work, we just look at the first PR that addresses the issue. Details of our policy are in our GSOC FAQ on our wiki.

@rzhao271
Contributor

I tried converting more tests to use assert_allclose instead of assert_almost_equal, but an issue I've run into is that for scalars, pytest.approx doesn't work well when the original test had a custom error message.
Here's how the signature looks for assert_almost_equal:
assert_almost_equal(a, b, decimal=something, err_msg=something)
But for pytest.approx, one uses it as follows:
assert a == approx(b, abs=something)
I'm wondering if I can just use assert_allclose instead? Otherwise I'd have to write something like

if a != approx(b, abs=something):
    # print the custom error message and fail the test
    pytest.fail("the custom error message")

for those tests.

@orbeckst
Member Author

Does

assert a == approx(b), "error message"

work?
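(For the record, a plain assert statement does accept a message after a comma, so something along these lines, with an illustrative tolerance, keeps the custom error message:)

```python
import pytest

a, b = 0.30000001, 0.3
# the string after the comma becomes the failure message
assert a == pytest.approx(b, abs=1e-6), f"mismatch: {a} != {b}"
```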

@Ranit-Bandyopadhyay

Hello, I want to work on this project.

@orbeckst
Member Author

Hello @Ranit-Bandyopadhyay, if nobody else is working on an issue, as evidenced by a linked PR, just go ahead and submit a PR. See also our GSOC FAQ https://github.com/MDAnalysis/mdanalysis/wiki/GSoC-FAQ#can-i-claim-an-issue-to-work-on .

@chandra20500

Hi, I would like to work on this issue. It would be great if you could assign it to me.

@orbeckst
Member Author

@chandra20500 , see what I said above and read the link to the FAQ: we do not reserve or assign issues. We just evaluate work presented.

@utkarsh147-del

Can you tell me where to find the testing code?

@hmacdope
Member

@utkarsh147-del you can find the link by reading the User Guide

@yickysan

Hello, I made a couple of changes to the test_align.py file. For starters, I replaced the numpy.testing.assert_almost_equal function with numpy.testing.assert_allclose, which according to the numpy documentation gives more consistent floating point comparisons. I also used the pytest.approx function for comparing scalars and other non-array values. I created a pull request and my code passed all the tests except for the dark_lint test; is there a reason for that?

@JoStoe
Contributor

JoStoe commented Sep 18, 2023

I'm developing a reader for the GROMOS11 trajectory format, as that would enable the research group I'm working with to use MDAnalysis (Issue #4291).

When preparing the necessary pytests, I found the documentation/userguide to ask for the use of assert_almost_equal().

In the pytests for the GROMOS reader, I compare the read-in position of one atom with the numerically exact position from the input file. As soon as the read-in positions are saved in the Timestep instance (ts.positions), they are converted to np.float32. The conversion causes a loss of precision in the 5th decimal, which would be acceptable for me.

This numeric error is caught by assert_almost_equal(), failing the tests.
Contrary to that, assert_allclose() passes.

I use both assert_almost_equal and assert_allclose without further arguments.

Is it valid to use assert_allclose as recommended here, even though the documentation requests otherwise? If yes, I would ask to also change the documentation (https://userguide.mdanalysis.org/2.6.0/testing.html).
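The behaviour described above can be reproduced with a small sketch (the coordinate value is made up): storing a double precision coordinate as np.float32 introduces an absolute rounding error larger than assert_almost_equal's default criterion of abs(desired - actual) < 1.5 * 10**(-7), while the relative error stays well below assert_allclose's default rtol=1e-7:

```python
import numpy as np
from numpy.testing import assert_allclose

exact = 12.345678            # position as written in the input file
stored = np.float32(exact)   # Timestep stores positions as float32

# the absolute error (~3e-7 here) exceeds 1.5e-7, so
# assert_almost_equal(stored, exact) with its default decimal=7 fails
abs_err = abs(float(stored) - exact)
assert abs_err > 1.5e-7

# the relative error (~3e-8) is below the default rtol=1e-7, so this passes
assert_allclose(stored, exact)
```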

@JoStoe JoStoe mentioned this issue Sep 18, 2023
@orbeckst
Member Author

@JoStoe use assert_allclose(). There's MDAnalysis/UserGuide#171 for updating the developer docs and the User Guide gladly accepts any help it can get — please feel very much invited to make a small PR to fix the docs.

@man0045

man0045 commented Nov 12, 2023

I would love to contribute to this project. Can you please assign me the issue?

@RMeli
Member

RMeli commented Nov 12, 2023

Hi @man0045, great to hear you want to contribute to MDAnalysis! However, we do not assign issues. If nobody else is working on an issue (as evidenced by a linked PR), you can work on it and submit a PR.

@aditya292002
Contributor

Hi maintainers, I have raised a PR for the above.
Modified files

  • mdanalysis/testsuite/MDAnalysisTests/analysis/test_align.py
  • /mdanalysis/package/MDAnalysis/visualization/streamlines_3D.py

Changes

  • Replaced all of the assert_almost_equal assertions with assert_allclose for more consistent floating point comparisons, as mentioned in the above issue.

Should I do the same for the rest of the files in the project?

orbeckst pushed a commit that referenced this issue Feb 24, 2024
…lclose (#4438)

* fixes part of #3743 
* replaced assert_almost_equal() with assert_allclose() for more consistent floating point comparisons;
   replaced decimal={N} with atol=0, rtol=1.5e-{N}
* updated authors file and changelog