Description
Fresh install of Ollama + bigAGI. I have followed the basic steps, and Ollama has been verified to work with other apps. I added Ollama to bigAGI as a service provider and can see the models. I am on v2-dev.
When in chat mode, trying to start a chat gives me this error in the chat UI:
[AIX Configuration Issue] Ollama: Ollama is not supported in this context
and this error on the server:
Aix.Ollama (dispatch-prepare): **[AIX Configuration Issue] Ollama**: Ollama is not supported in this context
The Ollama service logs the request but shows no error.
Would appreciate any help.
Device and browser
Linux Local deployment with npm
Screenshots and more
Willingness to Contribute
🙋‍♂️ Yes, I would like to contribute a fix.
True, Ollama has not been ported to AIX (big-AGI 2's new AI engine) yet.
Model listing works, but Ollama uses a non-standard protocol that still needs to be implemented.
AIX has new native support for OpenAI-compatible, Anthropic, and Google protocols, but not Ollama yet.
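For reference, here is a minimal sketch of what a native Ollama dispatch would roughly have to handle: Ollama's native /api/chat endpoint streams newline-delimited JSON rather than SSE, which is why the existing OpenAI/Anthropic/Google parsers don't apply. The endpoint, request body, and response fields follow Ollama's public API; the base URL and model name are just examples, and none of the surrounding AIX wiring is shown.

```ts
// Sketch only: parse Ollama's native /api/chat NDJSON stream (Node 18+ fetch).
async function streamOllamaChat(baseUrl: string, model: string, prompt: string) {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model,
      messages: [{ role: 'user', content: prompt }],
      stream: true,
    }),
  });
  if (!res.ok || !res.body) throw new Error(`Ollama error: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // each complete line is one JSON object: { message: { content }, done }
    let nl: number;
    while ((nl = buffer.indexOf('\n')) >= 0) {
      const line = buffer.slice(0, nl).trim();
      buffer = buffer.slice(nl + 1);
      if (!line) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) process.stdout.write(chunk.message.content);
      if (chunk.done) return;
    }
  }
}

// example: streamOllamaChat('http://localhost:11434', 'llama3', 'Hello!');
```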
It's likely not much work, so if anyone wants to contribute it, I'd welcome a PR. I'm working on the official 2.0 release, which will likely launch without Ollama support unless someone steps in and puts in the day of work required to make it happen.
It's also possible that Ollama has improved its OpenAI compatibility, in which case support could be ready faster.
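As a possible stopgap (untested against v2-dev), Ollama does expose an OpenAI-compatible endpoint under /v1, so pointing an OpenAI-compatible service at it may work until a native dispatch exists. The URL and model name below are assumptions for a default local install:

```ts
// Sketch only: call Ollama through its OpenAI-compatible /v1 endpoint.
async function chatViaOpenAICompat(prompt: string): Promise<string> {
  const res = await fetch('http://localhost:11434/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llama3', // any model already pulled into Ollama
      messages: [{ role: 'user', content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`Ollama /v1 error: ${res.status}`);
  const data = await res.json();
  return data.choices?.[0]?.message?.content ?? '';
}
```

How complete that compatibility layer is depends on the Ollama version, so a native AIX dispatch is still the proper fix.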