add google support for llm-client (aka llm-polyglot) #69

Closed
wants to merge 10 commits

Conversation

@jcamera commented Jul 11, 2024

llm-polyglot support for Google AI

This uses the Google AI SDK for JavaScript (aka the Gemini API) - https://github.com/google-gemini/generative-ai-js

Be sure to set GOOGLE_API_KEY in your environment for use.

Features added:

  • Chat completion with support for message arrays conforming to the OpenAI spec (see the sketch after this list)
    • Google's content array "parts" uses the roles "model" (instead of "assistant") and "user", while system messages may be included in the systemInstruction field
  • Support for streaming chat completion
  • Function calling - streaming and non-streaming
  • Support for the Google AI Cache Manager - cache data may be added through createCacheManager and used by passing the cache name from the response into chat.completions.create, setting it in { additionalProperties: cacheName } (note: this requires a paid account)
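
For illustration, here is a rough TypeScript sketch of how the OpenAI-compatible surface might look with the new provider. The "google" provider key, the model id, and the response shapes are assumptions based on the description above and on llm-polyglot's existing clients, not copied from this PR's /examples:

```typescript
import { createLLMClient } from "llm-polyglot"

// Assumes GOOGLE_API_KEY is set in the environment (see above).
// The "google" provider key and the model id are illustrative, not verbatim.
const client = createLLMClient({ provider: "google" })

// OpenAI-style message array; per this PR, "assistant" maps to Google's
// "model" role and the system message is lifted into systemInstruction.
const completion = await client.chat.completions.create({
  model: "gemini-1.5-flash",
  max_tokens: 512,
  messages: [
    { role: "system", content: "You are a terse assistant." },
    { role: "user", content: "What is the capital of Montana?" }
  ]
})

console.log(completion.choices?.[0]?.message?.content)

// Streaming goes through the same call with stream: true, yielding
// OpenAI-style chunks as an async iterable.
const stream = await client.chat.completions.create({
  model: "gemini-1.5-flash",
  max_tokens: 512,
  stream: true,
  messages: [{ role: "user", content: "Write a haiku about caching." }]
})

for await (const chunk of stream) {
  process.stdout.write(chunk.choices?.[0]?.delta?.content ?? "")
}
```

Function calling would follow the same OpenAI-style tools/tool_calls shape in both streaming and non-streaming modes, per the feature list above.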

Note: a polyfill was added due to missing support for TextEncoderStream in Bun (oven-sh/bun#5648)
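
For context, a TextEncoderStream polyfill can be as small as wrapping TextEncoder in a TransformStream. The sketch below is illustrative and not necessarily the polyfill this PR ships; a production version would also handle surrogate pairs split across chunks:

```typescript
// Minimal sketch of a TextEncoderStream polyfill for runtimes that lack it
// (e.g. Bun at the time of this PR). Illustrative only; it does not handle
// surrogate pairs split across string chunks.
if (typeof globalThis.TextEncoderStream === "undefined") {
  class TextEncoderStreamPolyfill extends TransformStream<string, Uint8Array> {
    readonly encoding = "utf-8"
    constructor() {
      const encoder = new TextEncoder()
      super({
        transform(chunk, controller) {
          controller.enqueue(encoder.encode(chunk))
        }
      })
    }
  }
  (globalThis as any).TextEncoderStream = TextEncoderStreamPolyfill
}
```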

We may also consider using their Vertex AI API, which has some tentative support for using OpenAI libraries with Gemini - https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/call-gemini-using-openai-library

Or we could use their Gemini REST API directly - https://ai.google.dev/api/rest

Feedback is welcome :-)

changeset-bot commented Jul 11, 2024

⚠️ No Changeset found

Latest commit: 7a91550

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

vercel bot commented Jul 11, 2024

island-ai-docs: ✅ Ready (updated Jul 16, 2024 5:00pm UTC)

@ZECTBynmo

Cool stuff @jcamera! Have you thought about how this might interact with/support the caching features that Gemini has? It'd be an awesome capability to add.

@jcamera (Author) commented Jul 13, 2024

> Cool stuff @jcamera! Have you thought about how this might interact with/support the caching features that Gemini has? It'd be an awesome capability to add.

Hey thank you @ZECTBynmo! I'm just getting my feet wet here, but I did look at an example using GoogleAICacheManager. We could totally add methods on the Google provider to support this, and use the same input data types as the chat completion.
And we could then add a param on chat.completions.create like { cache: name } using the name returned when the cache is set. This would diverge from the OpenAI spec a bit, but shouldn't hurt anything as an extra optional field. If OpenAI eventually adds a cache field we'd need to update. :-)
I can start adding this into this PR unless any objections.

@ZECTBynmo

> I can start adding this into this PR unless any objections.

Awesome!

@jcamera (Author) commented Jul 15, 2024

Hi @ZECTBynmo, FYI I added some pieces to allow using the Google cache manager (a paid account is needed to use it) and added an example in /examples. I'd be curious to learn about your use cases, and happy to help if there are any issues! 🙂
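
For reference, a minimal sketch of that flow under stated assumptions: it assumes createCacheManager is exposed on the client, that the cache response carries a name field, and that the cache name is passed via the additionalProperties option described in the PR summary. The exact names and shapes may differ from the code in /examples:

```typescript
import { createLLMClient } from "llm-polyglot"

// Provider key and method names below are assumptions based on this thread,
// not verbatim from the code; see the example in /examples for the real API.
const client = createLLMClient({ provider: "google" })

// Hypothetical cache-manager call: store a large shared context once
// (requires a paid Google AI account), reusing the OpenAI-style message shape.
const cache = await client.createCacheManager({
  model: "gemini-1.5-flash",
  messages: [{ role: "user", content: "<large shared context here>" }]
})

// Later completions reference the cache by the name returned above, passed
// through the extra field mentioned in the PR summary (exact shape may differ).
const completion = await client.chat.completions.create({
  model: "gemini-1.5-flash",
  max_tokens: 256,
  messages: [{ role: "user", content: "Answer using the cached context." }],
  additionalProperties: cache.name
})

console.log(completion.choices?.[0]?.message?.content)
```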

@jcamera marked this pull request as ready for review July 15, 2024 15:00
@jcamera changed the title from "[WIP] adding google support for llm-client (aka llm-polyglot)" to "add google support for llm-client (aka llm-polyglot)" on Jul 15, 2024