Should the ollama setting here point to my local configuration? #36
Comments
Your local ollama needs the model defined in the TS code, and the model name in the API request must be changed to one you actually have locally.
I have already configured a local model at the API endpoint. Based on what you said, I also need the model referenced in the TS code.
You may try the 'llava-llama3' model on ollama; it is a language model with vision support.
If you want the 'moondream' model used in the original code, you can run 'ollama pull moondream:1.8b-v2-fp16' to download it.
Thank you for the material you provided, I will try this today. |