Added intro for Vertex AI library
SauravP97 committed Dec 22, 2024
1 parent a0f1b5a commit d566eb6
Showing 2 changed files with 106 additions and 23 deletions.
106 changes: 106 additions & 0 deletions libs/vertexai/langchain_google_vertexai/__init__.py
@@ -1,3 +1,109 @@
"""
## langchain-google-vertexai
This module contains the LangChain integrations for Google Cloud generative models.
## Installation
```bash
pip install -U langchain-google-vertexai
```
## Chat Models
The `ChatVertexAI` class exposes models such as `gemini-pro` and `chat-bison`.
To use it, you should have a Google Cloud project with the relevant APIs enabled and credentials configured. Initialize the model as:
```python
from langchain_google_vertexai import ChatVertexAI
llm = ChatVertexAI(model_name="gemini-pro")
llm.invoke("Sing a ballad of LangChain.")
```
You can use other models, e.g. `chat-bison`:
```python
from langchain_google_vertexai import ChatVertexAI
llm = ChatVertexAI(model_name="chat-bison", temperature=0.3)
llm.invoke("Sing a ballad of LangChain.")
```
#### Multimodal inputs
The Gemini vision model supports image inputs when they are provided in a single chat message. Example:
```python
from langchain_core.messages import HumanMessage
from langchain_google_vertexai import ChatVertexAI
llm = ChatVertexAI(model_name="gemini-pro-vision")
# A single message can combine text parts and image parts
message = HumanMessage(
content=[
{
"type": "text",
"text": "What's in this image?",
}, # You can optionally provide text parts
{"type": "image_url", "image_url": {"url": "https://picsum.photos/seed/picsum/200/300"}},
]
)
llm.invoke([message])
```
The value of `image_url` can be any of the following (a GCS example is sketched below):
- A public image URL
- A Google Cloud Storage URI (e.g., `gs://path/to/file.png`)
- A base64-encoded image (e.g., `data:image/png;base64,abcd124`)
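For instance, a message pointing the model at an image stored in Google Cloud Storage might look like the sketch below. The bucket and object path are placeholders, and the configured credentials must be able to read the object:
```python
from langchain_core.messages import HumanMessage
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro-vision")

# "gs://my-bucket/path/to/file.png" is a placeholder; point it at an image
# object that your credentials can read.
message = HumanMessage(
    content=[
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url", "image_url": {"url": "gs://my-bucket/path/to/file.png"}},
    ]
)
llm.invoke([message])
```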
## Embeddings
You can use Google Cloud's embedding models as follows:
```python
from langchain_google_vertexai import VertexAIEmbeddings
embeddings = VertexAIEmbeddings()
embeddings.embed_query("hello, world!")
```
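To embed several documents at once, the standard LangChain `embed_documents` method is also available. A minimal sketch (the model name below is an assumption; substitute whichever embedding model is enabled in your project):
```python
from langchain_google_vertexai import VertexAIEmbeddings

# "text-embedding-004" is an example model name; use one that is
# available in your Google Cloud project.
embeddings = VertexAIEmbeddings(model_name="text-embedding-004")

vectors = embeddings.embed_documents(
    ["LangChain integrates with Vertex AI.", "Embeddings map text to vectors."]
)
print(len(vectors), len(vectors[0]))  # number of documents, embedding size
```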
## LLMs
You can use Google Cloud's generative AI models as LangChain LLMs:
```python
from langchain_core.prompts import PromptTemplate
from langchain_google_vertexai import ChatVertexAI
template = \"""Question: {question}
Answer: Let's think step by step.\"""
prompt = PromptTemplate.from_template(template)
llm = ChatVertexAI(model_name="gemini-pro")
chain = prompt | llm
question = "Who was the president of the USA in 1994?"
print(chain.invoke({"question": question}))
```
You can use Gemini and PaLM models, including code-generation ones:
```python
from langchain_google_vertexai import VertexAI
llm = VertexAI(model_name="code-bison", max_output_tokens=1000, temperature=0.3)
question = "Write a python function that checks if a string is a valid email address"
output = llm.invoke(question)
```
"""

from google.cloud.aiplatform_v1beta1.types import (
FunctionCallingConfig,
FunctionDeclaration,
23 changes: 0 additions & 23 deletions libs/vertexai/langchain_google_vertexai/chat_models.py
@@ -442,29 +442,6 @@ def _parse_content(raw_content: str | Dict[Any, Any]) -> Dict[Any, Any]:


def _parse_examples(examples: List[BaseMessage]) -> List[InputOutputTextPair]:
"""Parse the list of examples. The method expects the examples to be in the order of Human Message followed by an AI Message.
Args:
examples: The list of examples to be parsed
Returns:
A parsed example list.
Raises:
ValueError:
- If an odd number of examples is given as an argument.
- If a HumanMessage is not found at every even index of the input examples list.
- If an AIMessage is not found at every odd index of the input examples list.
A valid list of examples looks as follows:
.. code-block:: python
examples = [
HumanMessage(content = "A first sample Human Message"),
AIMessage(content = "A first sample AI Message"),
HumanMessage(content = "A second sample Human Message"),
AIMessage(content = "A second sample AI Message"),
]
"""

if len(examples) % 2 != 0:
raise ValueError(
f"Expect examples to have an even amount of messages, got {len(examples)}."
