Hi,
I have a model that I usually have to run in llama.cpp directly, because only there can I set the exact number of layers to offload to the GPU.
For example, with the model shown in the image below, LM Studio's configuration lets me select GPU offloading from 0 to 40.
If I select 40, the logs show that it offloaded 41/41 layers to the GPU.
If I select 39, the logs show that it offloaded 39/41 layers to the GPU.
How do I offload exactly 40/41 layers to the GPU? I can do this in llama.cpp directly (rough example below).
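For reference, this is roughly what I do with llama-cpp-python today; the model path is just a placeholder, and the point is only that `n_gpu_layers` accepts the exact layer count I want:

```python
# Rough sketch with llama-cpp-python; the model path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./my-model.gguf",  # placeholder path to the same GGUF model
    n_gpu_layers=40,               # offload exactly 40 of the 41 layers
)

# Quick sanity check that the model loads and generates.
print(llm("Hello,", max_tokens=16)["choices"][0]["text"])
```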
Thanks for LM Studio!