
Error with ollama and openai adapters #79

Open
matiasmolinas opened this issue Jan 16, 2025 · 2 comments
Labels
question Further information is requested

Comments


matiasmolinas commented Jan 16, 2025

Hi, after trying the latest version of bee-stack with both Ollama and OpenAI, I am unable to use bee-stack on an Ubuntu machine with 32 GB of RAM. I encountered different errors in each case. I have verified that Ollama is running properly and that the OpenAI key works correctly.

Attached are screenshots and logs for each case:
- openai (screenshot)
- ollama (screenshot)
- 2025-01-16_1004S.zip (logs)

@Tomas2D Tomas2D transferred this issue from i-am-bee/bee-agent-framework Jan 17, 2025
Tomas2D (Contributor) commented Jan 17, 2025

Please try pulling the latest `bee-stack` and running the following commands:

```shell
./bee-stack.sh clean
./bee-stack.sh setup
./bee-stack.sh start
```

@Tomas2D Tomas2D added the question Further information is requested label Jan 17, 2025
Tomas2D (Contributor) commented Jan 17, 2025

The default model the API uses is `llama3.1`; ensure you have it. Run `ollama pull llama3.1`, or create a new agent with a custom model (https://github.com/i-am-bee/bee-stack?tab=readme-ov-file#custom-models).
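As a quick sanity check before retrying bee-stack, you can confirm that `llama3.1` has actually been pulled. A minimal sketch, assuming Ollama is listening on its default local endpoint (`http://localhost:11434`) and exposes the standard `GET /api/tags` listing of pulled models; the helper names (`has_model`, `check_ollama`) are illustrative, not part of bee-stack:

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumption: default port, no auth)
OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"

def has_model(tags: dict, wanted: str) -> bool:
    """Return True if a pulled model matches `wanted`, ignoring an
    optional ':tag' suffix such as ':latest'."""
    for model in tags.get("models", []):
        name = model.get("name", "")
        if name == wanted or name.split(":", 1)[0] == wanted:
            return True
    return False

def check_ollama(wanted: str = "llama3.1") -> bool:
    """Query the running Ollama server and report whether `wanted` is pulled."""
    with urllib.request.urlopen(OLLAMA_TAGS_URL, timeout=5) as resp:
        tags = json.load(resp)
    return has_model(tags, wanted)
```

If `check_ollama()` returns `False` (or the request fails because the server is not running), that would explain the Ollama-side errors independently of any bee-stack configuration issue.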
