Possible to run SDXL via Colab Pro (not Pro+)? #2412
WorldofDepth started this conversation in General
Replies: 3 comments 9 replies
-
Set the runtime shape to High-RAM, in the "Runtime" menu.
3 replies
-
Same here. Setting High-RAM doesn't solve the problem.
0 replies
-
Test-run it without the first cell, to make sure the issue isn't caused by some extension.
6 replies
-
Hi, does anyone know if this is possible, perhaps by using the --medvram / --lowvram options? I don't know where to insert those in the notebook to try them out (I tried putting them after instances of webui.py in the last code block, but that did not work).
Colab Pro gives you 12.7 GB of system RAM / GPU RAM, and loading SDXL in the latest notebook doesn't appear to use more than 7 GB at any moment, yet the process is killed (it prints ^C) before creating the model / loading the weights finishes.
If this is not possible, what are people's recommendations for alternatives when running locally is not an option? Thanks for any help!
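For anyone hitting the same wall: a hedged sketch of where such flags usually go. --medvram and --lowvram are real AUTOMATIC1111 webui flags, but the exact launch cell varies by notebook, so the commands below are an assumption about how yours invokes the webui, not the notebook's actual contents.

```shell
# Hedged sketch, not this notebook's actual cell: the webui's memory-saving
# flags are passed to launch.py (which forwards them to the webui process).
# Try --medvram first; fall back to --lowvram if weight loading still dies.
python launch.py --medvram

# Some launch scripts instead read flags from the COMMANDLINE_ARGS
# environment variable, so setting it before the launch cell may also work:
export COMMANDLINE_ARGS="--medvram"
```

Note that if the ^C is the Colab kernel killing the process for exhausting *system* RAM during checkpoint loading, these flags (which mainly reduce VRAM pressure) may not be enough on their own.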