Hello!
I trained two models on the same training data, one using the nnU-Net framework and the other using nnDetection. When comparing inference times, I disabled test-time augmentation for both models.
What I found is that nnU-Net is approximately 3x faster than nnDetection during inference. Does anyone have an explanation for this? It seems to me that the preprocessing step in nnDetection could be part of the issue, but I’m not entirely sure.
Has anyone else observed something similar, or does anyone know what might be causing this difference?
Thanks in advance!
Hello! Yes, the preprocessing step in nnDetection can be slow. In addition, the boxes predicted by the network need to be post-processed: duplicates are suppressed, boxes with very low prediction scores are discarded, and so on. These are steps that nnU-Net does not perform.
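To make the extra cost concrete, here is a minimal sketch of the kind of box post-processing described above (score thresholding followed by non-maximum suppression). This is an illustrative toy implementation, not nnDetection's actual code; the function names and threshold values are assumptions.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def postprocess(boxes, scores, score_thresh=0.05, iou_thresh=0.5):
    """Drop low-confidence boxes, then suppress overlapping duplicates
    (greedy non-maximum suppression, highest score first)."""
    keep_mask = scores >= score_thresh
    boxes, scores = boxes[keep_mask], scores[keep_mask]
    order = np.argsort(-scores)  # indices sorted by descending score
    kept = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thresh for j in kept):
            kept.append(i)
    return boxes[kept], scores[kept]
```

Even this simplified version is quadratic in the number of candidate boxes, and a detection network can emit thousands of candidates per image, which helps explain why a detection pipeline spends time after the forward pass that a pure segmentation pipeline like nnU-Net's does not.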