
[Bug]: Perception CUDA out of memory #102

Closed
okrusch opened this issue Nov 27, 2023 · 1 comment

okrusch commented Nov 27, 2023

Current Behavior

When I try to run the model of the vision node on CUDA, I get a CUDA out of memory error.

Expected Behavior

It should be possible to use CUDA if available.

How to reproduce the issue

For now, the device is set to CPU, but it can be changed in vision_node.py.

Resolving this issue also includes updating the torch and CUDA versions.
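For context, a minimal sketch of the kind of device selection this touches; the helper name `select_device` and its placement are assumptions for illustration, not the actual contents of vision_node.py:

```python
import torch

# Hypothetical helper (not the actual vision_node.py code): pick CUDA when
# it is requested and available, otherwise fall back to CPU.
def select_device(prefer_cuda: bool = True) -> torch.device:
    if prefer_cuda and torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = select_device(prefer_cuda=False)   # current workaround: stay on CPU
# device = select_device(prefer_cuda=True)  # switching to CUDA reproduces the OOM error
```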

okrusch added the bug and perception labels Nov 27, 2023
okrusch self-assigned this Nov 27, 2023

MaxJa4 commented Nov 27, 2023

Does this also happen when using FP16 precision for inference? Or is the error occurring despite having enough VRAM?
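(For reference, a minimal sketch of FP16/autocast inference in PyTorch, which reduces activation memory; the model and input below are placeholders, not the actual vision node code.)

```python
import torch
import torch.nn as nn

# Placeholder model standing in for the vision node's detector.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 8, 3, padding=1),
).eval().to("cuda")

frame = torch.rand(1, 3, 720, 1280, device="cuda")  # dummy camera frame

# Autocast runs the forward pass in FP16 where safe, cutting VRAM use
# compared to full FP32 inference.
with torch.no_grad(), torch.autocast(device_type="cuda", dtype=torch.float16):
    output = model(frame)
```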

okrusch moved this to 📋 Backlog in PAF Project Backlog Nov 27, 2023
okrusch moved this from 📋 Backlog to 🔖 Ready in PAF Project Backlog Nov 29, 2023
okrusch moved this from 🔖 Ready to 🏗 In progress in PAF Project Backlog Nov 30, 2023
okrusch moved this from 🏗 In progress to 👀 In review in PAF Project Backlog Dec 1, 2023
okrusch moved this from 👀 In review to ✅ Done in PAF Project Backlog Dec 7, 2023