
Use Portkey-AI Gateway as a proxy to send requests to LLM providers #19

Open
wilderlopes opened this issue Jun 1, 2024 · 0 comments
Labels
enhancement New feature or request


wilderlopes commented Jun 1, 2024

New feature

We would like a supervisor that can route requests to the appropriate LLM. miniogre can work with multiple LLMs, but currently we implement a separate API integration for each of them.

The Portkey-AI Gateway repository provides a clean, easy way to proxy all requests to LLM APIs through a single endpoint.

Basic approach

Functions like this one in the miniogre code base should be deprecated in favor of the gateway mentioned above. This will result in cleaner code that is easier to maintain, because we won't need to write a new function every time we add a new LLM provider.
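For illustration, a unified call path through the gateway might look like the sketch below. This is a hypothetical helper, not existing miniogre code: it assumes a gateway running locally on Portkey's default port, and the exact URL and header names should be checked against the Portkey docs. The point is that switching providers becomes a parameter instead of a new function.

```python
# Hypothetical sketch: route every LLM request through one local
# Portkey-AI Gateway endpoint instead of one client function per provider.
# GATEWAY_URL and build_gateway_request are illustrative names, not miniogre code.

GATEWAY_URL = "http://localhost:8787/v1/chat/completions"

def build_gateway_request(provider: str, api_key: str, model: str, prompt: str):
    """Build one OpenAI-compatible request; the gateway forwards it to `provider`."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
        # Provider selection happens via a request header,
        # not via separate per-provider code paths.
        "x-portkey-provider": provider,
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return GATEWAY_URL, headers, body

# The same helper covers every provider the gateway supports:
url, headers, body = build_gateway_request(
    "openai", "sk-...", "gpt-4o", "Summarize this repo"
)
url2, headers2, body2 = build_gateway_request(
    "anthropic", "sk-...", "claude-3-opus-20240229", "Summarize this repo"
)
```

The actual HTTP call (e.g. with `requests.post(url, headers=headers, json=body)`) would then be identical for all providers, which is the cleanup this issue proposes.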
