
Add Google Gemini API support to aisuite #181

Open · wants to merge 1 commit into base: main
58 changes: 57 additions & 1 deletion README.md
Original file line number Diff line number Diff line change
@@ -8,7 +8,7 @@ Simple, unified interface to multiple Generative AI providers.
`aisuite` makes it easy for developers to use multiple LLMs through a standardized interface. Using an interface similar to OpenAI's, `aisuite` makes it easy to interact with the most popular LLMs and compare the results. It is a thin wrapper around Python client libraries, and allows creators to seamlessly swap out and test responses from different LLM providers without changing their code. Today, the library is primarily focused on chat completions. We will expand it to cover more use cases in the near future.

Currently supported providers are:
OpenAI, Anthropic, Azure, Google, AWS, Groq, Mistral, HuggingFace, Ollama, Sambanova and Watsonx.
OpenAI, Anthropic, Azure, Google, AWS, Groq, Mistral, HuggingFace, Ollama, Sambanova, Watsonx, and Google Gemini.
To maximize stability, `aisuite` uses either the HTTP endpoint or the SDK for making calls to the provider.

## Installation
@@ -119,3 +119,59 @@ We follow a convention-based approach for loading providers, which relies on str
in providers/openai_provider.py

This convention simplifies the addition of new providers and ensures consistency across provider implementations.
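The convention described above can be sketched as a simple mapping from provider key to class name (the helper name is ours, for illustration only):

```python
def provider_class_name(provider_key: str) -> str:
    """Illustrative sketch of the naming convention: the module
    providers/<key>_provider.py is expected to define <Key>Provider."""
    return "".join(part.capitalize() for part in provider_key.split("_")) + "Provider"

# "google_genai" maps to module providers/google_genai_provider.py
# and class GoogleGenaiProvider.
```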

## Using Google Gemini API

To use the Google Gemini API with `aisuite`, follow these steps:

### Prerequisites

1. **Google Cloud Account**: Ensure you have a Google Cloud account. If not, create one at [Google Cloud](https://cloud.google.com/).
2. **API Key**: Obtain an API key for the Google Gemini API. You can generate an API key from the [Google Cloud Console](https://console.cloud.google.com/).

### Installation

Install the `google-genai` Python client:

Example with pip:
```shell
pip install google-genai
```

Example with poetry:
```shell
poetry add google-genai
```

### Configuration

Set the `GEMINI_API_KEY` environment variable with your API key:

```shell
export GEMINI_API_KEY="your-gemini-api-key"
```
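The API key can also be supplied programmatically. A minimal sketch, assuming `aisuite`'s `Client` accepts a per-provider configuration mapping (the exact constructor signature may differ; the provider itself reads `api_key` from its config before falling back to the environment variable):

```python
import os

# Per-provider configuration dict; "api_key" is the key the provider
# looks up before falling back to GEMINI_API_KEY.
provider_configs = {
    "google_genai": {"api_key": os.getenv("GEMINI_API_KEY", "your-gemini-api-key")},
}

# client = ai.Client(provider_configs)  # hypothetical wiring
```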

### Create a Chat Completion

In your code:
```python
import aisuite as ai
client = ai.Client()

provider = "google_genai"
model_id = "gemini-2.0-flash-exp"

messages = [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "What’s the weather like in San Francisco?"},
]

response = client.chat.completions.create(
model=f"{provider}:{model_id}",
messages=messages,
)

print(response.choices[0].message.content)
```
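The `f"{provider}:{model_id}"` string is `aisuite`'s provider-prefix convention; splitting on the first colon recovers both parts, which keeps model IDs that themselves contain colons safe:

```python
model = "google_genai:gemini-2.0-flash-exp"
# Split on the first colon only, so the model ID may contain colons.
provider_key, model_id = model.split(":", 1)
```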

Happy coding! If you would like to contribute, please read our [Contributing Guide](https://github.com/andrewyng/aisuite/blob/main/CONTRIBUTING.md).
49 changes: 49 additions & 0 deletions aisuite/providers/google_genai_provider.py
I had some issues with parsing the name, so I just changed the name to `Ggenai` for the class and the file.
Also, the genai API doesn't accept the temperature directly, so you must change `**kwargs` to `config=types.GenerateContentConfig(**kwargs)` in `generate_content` and `chat_completions_create`.
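One hedged way to apply that suggestion is to separate sampling options (which the genai SDK expects inside `types.GenerateContentConfig`) from the remaining keyword arguments before making the call; the key set below is illustrative, not exhaustive:

```python
def split_generation_kwargs(kwargs):
    """Illustrative helper: pull sampling options out of **kwargs so they
    can be passed as config=types.GenerateContentConfig(**config_kwargs)
    while everything else is forwarded unchanged."""
    config_keys = {"temperature", "top_p", "top_k", "max_output_tokens"}
    config_kwargs = {k: v for k, v in kwargs.items() if k in config_keys}
    other_kwargs = {k: v for k, v in kwargs.items() if k not in config_keys}
    return config_kwargs, other_kwargs
```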

Author
I’ll take a look. Appreciate any edits, as I’m not too experienced integrating APIs.


Could I help on this?


I'll get this tomorrow


@vargacypher vargacypher Feb 10, 2025


I think that we should keep the logic in the same provider, but make it dynamic and configurable to run using the main Vertex SDK or with this new genai SDK.

What do you think?
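A configurable backend could be sketched as below. Note that the `google-genai` SDK can itself target Vertex AI via `genai.Client(vertexai=True, project=..., location=...)`, so the switch may reduce to choosing constructor kwargs; the config field names here are assumptions:

```python
def make_client_kwargs(config):
    """Sketch: build genai.Client kwargs for either backend, based on a
    hypothetical "use_vertex" flag in the provider config."""
    if config.get("use_vertex"):
        return {
            "vertexai": True,
            "project": config["project"],
            "location": config.get("location", "us-central1"),
        }
    return {"api_key": config["api_key"]}
```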

@@ -0,0 +1,49 @@
import os
from google import genai
from google.genai import types
from aisuite.provider import Provider, LLMError
from aisuite.framework import ChatCompletionResponse


class GoogleGenaiProvider(Provider):
    def __init__(self, **config):
        self.api_key = config.get("api_key") or os.getenv("GEMINI_API_KEY")
        if not self.api_key:
            raise ValueError(
                "Gemini API key is missing. Please provide it in the config or set the GEMINI_API_KEY environment variable."
            )
        self.client = genai.Client(api_key=self.api_key)

    def chat_completions_create(self, model, messages, **kwargs):
        try:
            # Note: only message contents are forwarded; role information
            # is dropped when flattening the chat into a contents list.
            response = self.client.models.generate_content(
                model=model,
                contents=[message["content"] for message in messages],
                **kwargs,
            )
            return self.normalize_response(response)
        except Exception as e:
            raise LLMError(f"Error in chat_completions_create: {str(e)}")

    def generate_content(self, model, contents, **kwargs):
        try:
            response = self.client.models.generate_content(
                model=model, contents=contents, **kwargs
            )
            return self.normalize_response(response)
        except Exception as e:
            raise LLMError(f"Error in generate_content: {str(e)}")

    def list_models(self):
        try:
            response = self.client.models.list()
            return [model.name for model in response]
        except Exception as e:
            raise LLMError(f"Error in list_models: {str(e)}")

    def normalize_response(self, response):
        # Map the Gemini response onto aisuite's OpenAI-style response shape.
        normalized_response = ChatCompletionResponse()
        normalized_response.choices[0].message.content = response.text
        return normalized_response
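Since `chat_completions_create` forwards only message contents, role information is lost before the request is made. A hypothetical alternative (not part of this PR) is to prefix each content string with its role when flattening:

```python
def flatten_messages(messages):
    """Hypothetical helper: preserve role context when flattening
    OpenAI-style chat messages into plain strings for generate_content."""
    return [f"{m['role']}: {m['content']}" for m in messages]
```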
55 changes: 55 additions & 0 deletions guides/google_genai.md
@@ -0,0 +1,55 @@
# Google Gemini API

To use the Google Gemini API with `aisuite`, follow these steps:

## Prerequisites

1. **Google Cloud Account**: Ensure you have a Google Cloud account. If not, create one at [Google Cloud](https://cloud.google.com/).
2. **API Key**: Obtain an API key for the Google Gemini API. You can generate an API key from the [Google Cloud Console](https://console.cloud.google.com/).

## Installation

Install the `google-genai` Python client:

Example with pip:
```shell
pip install google-genai
```

Example with poetry:
```shell
poetry add google-genai
```

## Configuration

Set the `GEMINI_API_KEY` environment variable with your API key:

```shell
export GEMINI_API_KEY="your-gemini-api-key"
```

## Create a Chat Completion

In your code:
```python
import aisuite as ai
client = ai.Client()

provider = "google_genai"
model_id = "gemini-2.0-flash-exp"

messages = [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "What’s the weather like in San Francisco?"},
]

response = client.chat.completions.create(
model=f"{provider}:{model_id}",
messages=messages,
)

print(response.choices[0].message.content)
```

Happy coding! If you would like to contribute, please read our [Contributing Guide](../CONTRIBUTING.md).
4 changes: 3 additions & 1 deletion pyproject.toml
@@ -15,6 +15,7 @@ groq = { version = "^0.9.0", optional = true }
mistralai = { version = "^1.0.3", optional = true }
openai = { version = "^1.35.8", optional = true }
ibm-watsonx-ai = { version = "^1.1.16", optional = true }
google-genai = { version = "^0.1.0", optional = true }

# Optional dependencies for different providers
httpx = "~0.27.0"
@@ -30,7 +31,8 @@ mistral = ["mistralai"]
ollama = []
openai = ["openai"]
watsonx = ["ibm-watsonx-ai"]
all = ["anthropic", "aws", "google", "groq", "mistral", "openai", "cohere", "watsonx"] # To install all providers
google_genai = ["google-genai"]
all = ["anthropic", "aws", "google", "groq", "mistral", "openai", "cohere", "watsonx", "google_genai"] # To install all providers

[tool.poetry.group.dev.dependencies]
pre-commit = "^3.7.1"