
[BUG] Ollama AIX Configuration Issue #678

Open
1 task done
matrix303 opened this issue Nov 13, 2024 · 1 comment
Labels
type: bug Something isn't working

Comments

@matrix303

Description

Fresh install of Ollama + bigAGI. I have followed the basic steps, and Ollama has been verified to work with other apps. I added Ollama to bigAGI as a service provider and can see the models. I am on v2-dev.

In chat mode, when I try to start a chat, it gives me this error in the chat UI:

[AIX Configuration Issue] Ollama: Ollama is not supported in this context

and this error in the server:

Aix.Ollama (dispatch-prepare): **[AIX Configuration Issue] Ollama**: Ollama is not supported in this context

The Ollama service logs the request but shows no error.

Would appreciate any help

Device and browser

Linux Local deployment with npm

Screenshots and more

[two screenshots attached]

Willingness to Contribute

  • 🙋‍♂️ Yes, I would like to contribute a fix.
@matrix303 matrix303 added the type: bug Something isn't working label Nov 13, 2024
@enricoros
Owner

enricoros commented Nov 13, 2024

True, Ollama has not been ported to AIX (big-AGI 2's new AI engine) yet.

Model listing works, but Ollama has a non-standard protocol that still needs to be implemented.

AIX has new native support for OpenAI-compatible, Anthropic, and Google protocols, but not Ollama yet.

It's likely not much work, so if anyone wants to contribute it, I'd welcome a PR. I'm working on the official 2.0 release, which will likely ship without Ollama support at launch unless someone steps in and puts in the day of work required to make this happen.

It's also possible that Ollama has improved its OpenAI compatibility, in which case the support will be ready faster.
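For anyone wanting to sketch out that adapter: Ollama's native `/api/chat` endpoint accepts the same `{role, content}` message shape as OpenAI's chat completions, but nests sampling parameters under an `options` object and renames some fields (e.g. `max_tokens` becomes `num_predict`). A minimal illustration of the payload translation an AIX Ollama dispatcher would need (the field names follow Ollama's documented API; the function itself is hypothetical and not part of big-AGI or AIX):

```python
# Hypothetical helper: translate an OpenAI-style chat-completion request
# into Ollama's native /api/chat payload. Field names follow Ollama's
# documented API; this is an illustrative sketch, not big-AGI code.

def openai_to_ollama_chat(request: dict) -> dict:
    """Map OpenAI chat fields onto Ollama's native request format."""
    payload = {
        "model": request["model"],
        # Ollama accepts the same {role, content} message objects.
        "messages": request["messages"],
        # Ollama streams NDJSON by default, so be explicit here.
        "stream": request.get("stream", False),
    }
    # Sampling parameters live under a nested "options" object in Ollama,
    # not at the top level as in the OpenAI schema.
    options = {}
    if "temperature" in request:
        options["temperature"] = request["temperature"]
    if "max_tokens" in request:
        options["num_predict"] = request["max_tokens"]  # renamed field
    if options:
        payload["options"] = options
    return payload


if __name__ == "__main__":
    req = {
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "Hello"}],
        "temperature": 0.7,
        "max_tokens": 128,
    }
    print(openai_to_ollama_chat(req))
```

The response side needs similar treatment: Ollama returns the assistant turn under a top-level `message` key and streams newline-delimited JSON rather than OpenAI's SSE frames.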
