Fix: Set default average='micro' consistently across classification metrics (some of the subclasses of MulticlassStatScores, MultilabelStatScores) #2882

Open
rittik9 wants to merge 7 commits into master from rittik/avg

Conversation

@rittik9 (Contributor) commented Dec 24, 2024

What does this PR do?

Partially Fixes #2320

Before submitting
  • Was this discussed/agreed via a Github issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Changed the default to average='micro' for the affected subclasses of MulticlassStatScores and MultilabelStatScores and updated their doctest values accordingly, as the earlier default was average='macro'.
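
To illustrate the effect, here is a minimal, hypothetical sketch (not part of the PR diff) using MulticlassPrecision as an example of a MulticlassStatScores subclass, assuming it is among the metrics whose default switches from average='macro' to average='micro'; the inputs are made up purely for demonstration.

```python
import torch
from torchmetrics.classification import MulticlassPrecision

preds = torch.tensor([0, 1, 2, 2])
target = torch.tensor([0, 1, 1, 2])

# Old default behavior: per-class precisions (1.0, 1.0, 0.5) averaged equally.
macro = MulticlassPrecision(num_classes=3, average="macro")
print(macro(preds, target))  # tensor(0.8333)

# Proposed default behavior: precision over pooled counts, 3 TP / (3 TP + 1 FP).
micro = MulticlassPrecision(num_classes=3, average="micro")
print(micro(preds, target))  # tensor(0.7500)
```

Because the two averaging modes give different results on the same inputs, any doctest that relied on the implicit default has to have its expected values regenerated.
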

Did you have fun?

Make sure you had fun coding 🙃


📚 Documentation preview 📚: https://torchmetrics--2882.org.readthedocs.build/en/2882/

@github-actions bot added the documentation (Improvements or additions to documentation) and topic: Classif labels Dec 24, 2024
@rittik9 marked this pull request as draft December 24, 2024 13:18
@rittik9 marked this pull request as ready for review December 24, 2024 17:56

codecov bot commented Dec 24, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 41%. Comparing base (767c678) to head (d19a1eb).

❌ Your project check has failed because the head coverage (41%) is below the target coverage (95%). You can increase the head coverage or adjust the target coverage.

❗ There is a different number of reports uploaded between BASE (767c678) and HEAD (d19a1eb).

HEAD has 4 fewer uploads than BASE:

Flag       BASE (767c678)   HEAD (d19a1eb)
gpu        2                0
unittest   2                0
Additional details and impacted files
@@           Coverage Diff            @@
##           master   #2882     +/-   ##
========================================
- Coverage      69%     41%    -28%     
========================================
  Files         346     332     -14     
  Lines       19172   18996    -176     
========================================
- Hits        13236    7742   -5494     
- Misses       5936   11254   +5318     

@rittik9 marked this pull request as draft December 24, 2024 18:54
@rittik9 marked this pull request as ready for review December 24, 2024 20:07
@rittik9 changed the title from "Fix: Set default average='micro' consistently across accuracy metrics" to "Fix: Set default average='micro' consistently across classification metrics (subclasses of MulticlassStatScores, MultilabelStatScores)" Dec 24, 2024
@rittik9 changed the title from "Fix: Set default average='micro' consistently across classification metrics (subclasses of MulticlassStatScores, MultilabelStatScores)" to "Fix: Set default average='micro' consistently across classification metrics (some of the subclasses of MulticlassStatScores, MultilabelStatScores)" Dec 25, 2024
@rittik9 force-pushed the rittik/avg branch 2 times, most recently from 45a06d4 to b545a3a, December 31, 2024 14:40
@Borda (Member) left a comment

I understand the motivation and it sounds good at first look. Just for context, we tried to have the same default behavior as SK-learn, which is why we do not have the same averaging everywhere... but open to chat about it and eventually make this breaking change...

cc: @lantiga @SkafteNicki

@rittik9 force-pushed the rittik/avg branch 5 times, most recently from f6d659a to 8b1ad0c, January 8, 2025 04:52
@mergify bot removed the "has conflicts" label Jan 13, 2025
Labels
documentation (Improvements or additions to documentation), topic: Classif
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Inconsistent default values for average argument in classification metrics
2 participants