feat: add ollama client #285

Merged: 15 commits into 0xPlaygrounds:main on Feb 24, 2025

Conversation

@451846939 (Contributor) commented on Feb 8, 2025

Background

This PR aims to address #288 by adding support for local model execution. Currently, the project only supports remote model inference, while local inference (e.g., using Ollama) offers advantages such as faster response times, enhanced data privacy, and greater control over execution.

Major Changes

  • Added Client struct: Encapsulates API calls to the Ollama local model server.

Code Structure

Key Files

  • src/providers/ollama.rs: Implements the Ollama API client.
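
As a rough illustration of the shape such a client can take, here is a minimal sketch of a native Ollama client, assuming reqwest for HTTP and serde for (de)serialization. The struct and method names are illustrative, not the PR's actual code; the endpoint and field names follow Ollama's documented /api/generate API.

```rust
use serde::{Deserialize, Serialize};

/// Illustrative client for a local Ollama server (default: http://localhost:11434).
pub struct Client {
    base_url: String,
    http: reqwest::Client,
}

#[derive(Serialize)]
struct GenerateRequest<'a> {
    model: &'a str,
    prompt: &'a str,
    stream: bool,
}

#[derive(Deserialize)]
struct GenerateResponse {
    response: String,
}

impl Client {
    pub fn new(base_url: impl Into<String>) -> Self {
        Self {
            base_url: base_url.into(),
            http: reqwest::Client::new(),
        }
    }

    /// Send a non-streaming completion request to Ollama's /api/generate endpoint.
    pub async fn generate(&self, model: &str, prompt: &str) -> Result<String, reqwest::Error> {
        let request = GenerateRequest { model, prompt, stream: false };
        let response: GenerateResponse = self
            .http
            .post(format!("{}/api/generate", self.base_url))
            .json(&request)
            .send()
            .await?
            .error_for_status()?
            .json()
            .await?;
        Ok(response.response)
    }
}
```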

@cvauclair added this to the 2025-02-24 milestone on Feb 10, 2025
@0xMochan (Contributor) left a comment

Hey, thanks for this PR. I want to clarify that our current Ollama example serves both local and remote clients for Ollama through the OpenAI compatibility layer, but of course it would be best to use the native Ollama API, as you've done here.

Please let me know if you have any questions about the comments I've made, thanks!
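
For context on the distinction, the two approaches hit different endpoints on the same local server. The sketch below is illustrative only; the paths and payload shapes follow Ollama's documented HTTP API, and the model name is just an example.

```rust
use serde_json::json;

fn main() {
    // OpenAI compatibility layer: code written against the OpenAI chat API
    // can point at this URL on a local Ollama server instead.
    let _openai_compat = (
        "http://localhost:11434/v1/chat/completions",
        json!({
            "model": "llama3.2",
            "messages": [{ "role": "user", "content": "Hello!" }],
        }),
    );

    // Native Ollama chat API: the endpoint a dedicated Ollama client targets.
    let _native = (
        "http://localhost:11434/api/chat",
        json!({
            "model": "llama3.2",
            "messages": [{ "role": "user", "content": "Hello!" }],
            "stream": false,
        }),
    );
}
```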

@mateobelanger changed the title from "add ollama client" to "feat: add ollama client" on Feb 18, 2025
@mateobelanger added the "model" label (Relevant to new model providers or implementations) on Feb 18, 2025
@0xMochan (Contributor) left a comment

Going to test locally and verify!

@0xMochan (Contributor) left a comment

Sorry, I know I approved this earlier, but can you update examples/agent_with_ollama to use the new client? Thanks!

@451846939 (Contributor, Author) commented

Thank you! I forgot to remove this From trait. I have added some examples that I used locally to test the Ollama client during development.
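
A minimal sketch of what such an examples/agent_with_ollama example could look like, assuming rig's usual agent-builder pattern; the exact constructor, module path, and builder methods shown here are assumptions, not taken from this PR, and tokio and anyhow are assumed as the example's runtime and error crate.

```rust
// Hypothetical examples/agent_with_ollama.rs; exact rig API names may differ.
use rig::{completion::Prompt, providers::ollama};

#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
    // Assumes an Ollama server running locally on its default port
    // with a model (e.g. llama3.2) already pulled.
    let client = ollama::Client::new();

    // Build a simple agent on top of the local model.
    let agent = client
        .agent("llama3.2")
        .preamble("You are a helpful assistant.")
        .build();

    // Prompt the agent and print its response.
    let answer = agent.prompt("Why is the sky blue?").await?;
    println!("{answer}");

    Ok(())
}
```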

@0xMochan (Contributor) left a comment

Looks good (I resolved the typos). Tested it and it works great, LGTM!

@0xMochan merged commit 0ff042c into 0xPlaygrounds:main on Feb 24, 2025
5 checks passed
The github-actions bot mentioned this pull request on Feb 24, 2025
Labels: model (Relevant to new model providers or implementations), non-breaking
5 participants