feat: add ollama client #285
Conversation
Hey, thanks for this PR. I want to clarify that our current ollama example already serves both local and remote ollama clients via the OpenAI compatibility layer, but of course it would be best to use the native ollama API as you've done here.
Please let me know if you have any questions about the comments I've made, thanks!
Going to test locally and verify!
Sorry, I approved this earlier; can you update examples/agent_with_ollama to use the new client, thanks!
Thank you! I forgot to remove this From trait. I have added some examples that I used locally to test the Ollama client during development.
Looks good (I resolved the typos). Tested great, lgtm!
Background
This PR aims to address #288 by adding support for local model execution. Currently, the project only supports remote model inference, while local inference (e.g., using Ollama) offers advantages such as faster response times, enhanced data privacy, and greater control over execution.
Major Changes
Client struct: Encapsulates API calls to the Ollama local model server.
Code Structure
Key Files
src/providers/ollama.rs
: Implements the Ollama API client.
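To illustrate the kind of wrapper described above, here is a minimal sketch of a client for Ollama's native non-streaming `/api/generate` endpoint. This is not the PR's actual code: the names `Client::new`, `generate_url`, and `generate_body` are hypothetical, and the real `src/providers/ollama.rs` presumably uses an HTTP and JSON crate rather than hand-built strings.

```rust
/// Hypothetical sketch of an Ollama client wrapper, std-only for brevity.
struct Client {
    base_url: String,
    model: String,
}

impl Client {
    fn new(base_url: &str, model: &str) -> Self {
        Client {
            // Normalize the base URL so joining paths is predictable.
            base_url: base_url.trim_end_matches('/').to_string(),
            model: model.to_string(),
        }
    }

    /// Ollama's native generate endpoint (default server is localhost:11434).
    fn generate_url(&self) -> String {
        format!("{}/api/generate", self.base_url)
    }

    /// Builds the JSON request body; escaping is simplified for illustration.
    /// A real implementation would serialize with a JSON library.
    fn generate_body(&self, prompt: &str) -> String {
        format!(
            "{{\"model\":\"{}\",\"prompt\":\"{}\",\"stream\":false}}",
            self.model,
            prompt.replace('"', "\\\"")
        )
    }
}

fn main() {
    let client = Client::new("http://localhost:11434/", "llama3");
    // The actual HTTP POST is omitted; we only show the request shape.
    println!("POST {}", client.generate_url());
    println!("{}", client.generate_body("Why is the sky blue?"));
}
```

The point of the encapsulation is that callers configure the server address and model once, then issue prompts without touching endpoint paths or request framing.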