
BUG: PatchCore tends to go OOM easily #3

Open
SinChee opened this issue Aug 17, 2024 · 0 comments

SinChee commented Aug 17, 2024

We hit an OOM in anomalib/models/components/dimensionality_reduction/random_projection.py (line 132) when running python main.py --mode train --data bras2021 --model patchcore

We are trying to replicate the experiments on 8 NVIDIA RTX 3090 GPUs but keep running out of memory.
Screenshot 2024-08-17 at 10 50 55

It seems the memory allocation only uses 1 GPU instead of spreading across the rest of the resources. How would one configure training to use more than one GPU?
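A note on the symptom: as far as I can tell, PatchCore's coreset subsampling projects the entire bank of patch embeddings through the random projection matrix in one large matrix product on a single device, so the peak allocation scales with the number of patches regardless of how many GPUs are available. A common workaround (a minimal sketch, not anomalib's actual code; `project_in_chunks` and `chunk_size` are hypothetical names) is to apply the projection in chunks so only one slice is materialized at a time:

```python
import numpy as np

def project_in_chunks(embeddings, projection, chunk_size=1024):
    """Compute embeddings @ projection chunk by chunk to bound peak memory.

    embeddings: (N, D) array of patch embeddings
    projection: (D, K) random projection matrix
    """
    n = embeddings.shape[0]
    out = np.empty((n, projection.shape[1]), dtype=embeddings.dtype)
    for start in range(0, n, chunk_size):
        stop = min(start + chunk_size, n)
        # Each slice allocates only (chunk_size, K) instead of (N, K) at once.
        out[start:stop] = embeddings[start:stop] @ projection
    return out
```

The result is identical to the one-shot product; only the peak working-set size changes. On GPU, the same idea applies with torch tensors (optionally moving finished chunks to CPU).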
