
[Question] nnU-Net vs nnDetection Inference Time #292

Open
tristanrauhut opened this issue Jan 23, 2025 · 1 comment

@tristanrauhut

❓ Question

Hello!
I trained two models on the same training data, one with the nnU-Net framework and the other with nnDetection. To compare inference times fairly, I disabled test-time augmentation for both models.
What I found is that nnU-Net is roughly 3x faster than nnDetection during inference. Does anyone have an explanation for this? My guess is that nnDetection's preprocessing step is part of the issue, but I'm not entirely sure.
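For reference, this is roughly how I made the wall-clock comparison. The `run_nnunet_inference` and `run_nndetection_inference` calls are hypothetical placeholders for whatever prediction entry points are used in practice; they are not real APIs of either framework.

```python
import time

def time_it(fn, *args, repeats: int = 3) -> float:
    """Return the best wall-clock time over a few repeated runs."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best

# Hypothetical usage with placeholder prediction functions:
# t_unet = time_it(run_nnunet_inference, test_cases)
# t_ndet = time_it(run_nndetection_inference, test_cases)
# print(f"nnDetection / nnU-Net wall-clock ratio: {t_ndet / t_unet:.1f}x")
```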
Has anyone else observed something similar, or does anyone know what might be causing this difference?
Thanks in advance!

@andres2631996

Hello! Yes, the preprocessing step in nnDetection can be slow. In addition, the boxes predicted by the network have to be post-processed: duplicates are removed (non-maximum suppression), boxes with very low prediction scores are discarded, and so on. These are steps that nnU-Net does not perform.
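For illustration, here is a minimal 2D sketch of the kind of post-processing meant above (score thresholding followed by non-maximum suppression), written with `torchvision.ops.nms`. This is not nnDetection's actual implementation (nnDetection predicts 3D boxes and has its own post-processing pipeline), and the threshold values are placeholders.

```python
import torch
from torchvision.ops import nms

def postprocess_boxes(boxes: torch.Tensor,
                      scores: torch.Tensor,
                      score_thresh: float = 0.05,
                      iou_thresh: float = 0.5):
    """Drop low-confidence boxes, then suppress overlapping duplicates.

    boxes:  (N, 4) tensor in (x1, y1, x2, y2) format
    scores: (N,) tensor of per-box confidence scores
    """
    # 1. Remove boxes with very low prediction scores.
    keep = scores > score_thresh
    boxes, scores = boxes[keep], scores[keep]

    # 2. Non-maximum suppression removes duplicate detections of the same object.
    keep_idx = nms(boxes, scores, iou_thresh)
    return boxes[keep_idx], scores[keep_idx]
```

Steps like these run for every predicted case on top of the network forward pass, which is part of why nnDetection's inference takes longer than nnU-Net's segmentation-only pipeline.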
