
CUDA error: Unknown error during inference #1

Open
NickLucche opened this issue Aug 29, 2022 · 3 comments
NickLucche commented Aug 29, 2022

This is likely caused by a CUDA version mismatch between the host driver and the container (e.g. nvidia-smi reports driver CUDA 11.7, while the container ships 11.3).
Hotfix:
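To check whether this mismatch applies to your setup before running the hotfix, you can compare the CUDA version reported by nvidia-smi on the host against the toolkit version inside the container (e.g. `python -c "import torch; print(torch.version.cuda)"`). A minimal sketch of that comparison; the helper name and structure are mine, not from this thread:

```python
# Hypothetical helper: flag a major.minor mismatch between the CUDA
# version reported by the host driver and the toolkit in the container.
def cuda_mismatch(driver_cuda: str, toolkit_cuda: str) -> bool:
    """Return True when the two CUDA versions differ in major.minor."""
    parse = lambda v: tuple(int(x) for x in v.split(".")[:2])
    return parse(driver_cuda) != parse(toolkit_cuda)

# The situation described in this issue: driver 11.7, container 11.3.
print(cuda_mismatch("11.7", "11.3"))  # True -> candidate cause of the error
print(cuda_mismatch("11.6", "11.6"))  # False -> versions agree
```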

```shell
# get inside the container
docker exec -it stable-diffusion bash
# upgrade the PyTorch/CUDA packages
conda install pytorch torchvision cudatoolkit=11.6 -c pytorch -c conda-forge
# exit the container (Ctrl+D)
# restart the container
docker restart stable-diffusion
```

mchaker commented Sep 3, 2022

I haven't encountered this. Is it still an issue?

NickLucche (Owner, Author)

I encountered this on a 2070 Super card. I'll verify it once again, but I think this is still an issue, and one that's hard to fix unless we let users choose their own PyTorch+CUDA package.


mchaker commented Sep 17, 2022

What are the full repro instructions? I'm getting a 2080 Ti next week and can test this.
