Commit

Merge branch 'main' of https://github.com/hwchase17/langchainjs into aoai-docs

jacoblee93 committed Mar 30, 2024
2 parents 1e2ec23 + 4a5cd29 commit 410449f
Showing 116 changed files with 4,444 additions and 504 deletions.
1 change: 1 addition & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -45,3 +45,4 @@ docs/build/
docs/api_refs/typedoc.json

.tool-versions
credentials.json
1 change: 1 addition & 0 deletions .vscode/settings.json
@@ -11,6 +11,7 @@
},
"typescript.tsdk": "node_modules/typescript/lib",
"cSpell.words": [
"AILLM",
"Upstash"
],
"cSpell.enableFiletypes": [
@@ -1,5 +1,5 @@
---
sidebar_label: Google AI
sidebar_label: Google GenAI
keywords: [gemini, gemini-pro, ChatGoogleGenerativeAI]
---

@@ -11,6 +11,12 @@ You can access Google's `gemini` and `gemini-vision` models, as well as other
generative models in LangChain through `ChatGoogleGenerativeAI` class in the
`@langchain/google-genai` integration package.

:::tip
You can also access Google's `gemini` family of models via the LangChain VertexAI and VertexAI-web integrations.

Click [here](/docs/integrations/chat/google_vertex_ai) to read the docs.
:::

Get an API key here: https://ai.google.dev/tutorials/setup

You'll first need to install the `@langchain/google-genai` package:
99 changes: 97 additions & 2 deletions docs/core_docs/docs/integrations/chat/google_palm.mdx
@@ -1,13 +1,14 @@
---
sidebar_label: Google PaLM
sidebar_label: (Legacy) Google PaLM/VertexAI
sidebar_class_name: hidden
---

import CodeBlock from "@theme/CodeBlock";

# ChatGooglePaLM

:::note
This integration does not support `gemini-*` models. Check [Google AI](/docs/integrations/chat/google_generativeai).
This integration does not support `gemini-*` models. Check Google [GenAI](/docs/integrations/chat/google_generativeai) or [VertexAI](/docs/integrations/chat/google_vertex_ai).
:::

The [Google PaLM API](https://developers.generativeai.google/products/palm) can be integrated by first
@@ -28,3 +29,97 @@ the model.
import GooglePaLMExample from "@examples/models/chat/integration_googlepalm.ts";

<CodeBlock language="typescript">{GooglePaLMExample}</CodeBlock>

# ChatGoogleVertexAI

LangChain.js supports Google Vertex AI chat models as an integration.
It supports two different methods of authentication based on whether you're running
in a Node environment or a web environment.

## Setup

### Node

To call Vertex AI models in Node, you'll need to install [Google's official auth client](https://www.npmjs.com/package/google-auth-library) as a peer dependency.

You should make sure the Vertex AI API is
enabled for the relevant project and that you've authenticated to
Google Cloud using one of these methods:

- You are logged into an account (using `gcloud auth application-default login`)
  that has access to that project.
- You are running on a machine using a service account that has access
  to the project.
- You have downloaded the credentials for a service account that has access
  to the project and set the `GOOGLE_APPLICATION_CREDENTIALS` environment
  variable to the path of this file.

<IntegrationInstallTooltip></IntegrationInstallTooltip>

```bash npm2yarn
npm install google-auth-library @langchain/community
```

### Web

To call Vertex AI models in web environments (like Edge functions), you'll need to install
the [`web-auth-library`](https://github.com/kriasoft/web-auth-library) package as a peer dependency:

```bash npm2yarn
npm install web-auth-library
```

Then, you'll need to add your service account credentials directly as a `GOOGLE_VERTEX_AI_WEB_CREDENTIALS` environment variable:

```
GOOGLE_VERTEX_AI_WEB_CREDENTIALS={"type":"service_account","project_id":"YOUR_PROJECT-12345",...}
```

You can also pass your credentials directly in code like this:

```typescript
import { ChatGoogleVertexAI } from "@langchain/community/chat_models/googlevertexai";

const model = new ChatGoogleVertexAI({
authOptions: {
credentials: {"type":"service_account","project_id":"YOUR_PROJECT-12345",...},
},
});
```

## Usage

Several models are available and can be specified by the `model` attribute
in the constructor. These include:

- code-bison (default)
- code-bison-32k

The ChatGoogleVertexAI class works just like other chat-based LLMs,
with a few exceptions:

1. The first `SystemMessage` passed in is mapped to the "context" parameter that the PaLM model expects.
No other `SystemMessages` are allowed.
2. After the first `SystemMessage`, there must be an odd number of messages, representing a conversation between a human and the model.
3. Human messages must alternate with AI messages.
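
The three numbered constraints above can be expressed as a small validity check. The helper below is hypothetical, for illustration only, and not part of any LangChain package:

```typescript
// Hypothetical helper sketching the legacy PaLM message constraints:
// an optional single leading system message, then an odd number of
// strictly alternating human/AI messages starting (and ending) with human.
type Role = "system" | "human" | "ai";

function isValidPaLMConversation(roles: Role[]): boolean {
  // Strip the single allowed leading system message.
  const rest = roles[0] === "system" ? roles.slice(1) : roles;
  // No other system messages are allowed.
  if (rest.includes("system")) return false;
  // There must be an odd number of remaining messages.
  if (rest.length === 0 || rest.length % 2 === 0) return false;
  // Human and AI messages must alternate, starting with a human turn.
  return rest.every((role, i) => role === (i % 2 === 0 ? "human" : "ai"));
}
```

For example, `["system", "human", "ai", "human"]` satisfies all three rules, while `["system", "human", "ai"]` does not (an even number of messages remains after the system message).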

import ChatGoogleVertexAI from "@examples/models/chat/integration_googlevertexai_legacy.ts";

<CodeBlock language="typescript">{ChatGoogleVertexAI}</CodeBlock>

### Streaming

ChatGoogleVertexAI also supports streaming in multiple chunks for faster responses:

import ChatGoogleVertexAIStreaming from "@examples/models/chat/integration_googlevertexai-streaming_legacy.ts";

<CodeBlock language="typescript">{ChatGoogleVertexAIStreaming}</CodeBlock>

### Examples

There is also an optional `examples` constructor parameter that can help the model understand what an appropriate response
looks like.

import ChatGoogleVertexAIExamples from "@examples/models/chat/integration_googlevertexai-examples_legacy.ts";

<CodeBlock language="typescript">{ChatGoogleVertexAIExamples}</CodeBlock>
82 changes: 51 additions & 31 deletions docs/core_docs/docs/integrations/chat/google_vertex_ai.mdx
@@ -1,10 +1,11 @@
---
sidebar_label: Google Vertex AI
keywords: [gemini, gemini-pro, ChatVertexAI, vertex]
---

import CodeBlock from "@theme/CodeBlock";

# ChatGoogleVertexAI
# ChatVertexAI

LangChain.js supports Google Vertex AI chat models as an integration.
It supports two different methods of authentication based on whether you're running
@@ -14,7 +15,15 @@ in a Node environment or a web environment.

### Node

To call Vertex AI models in Node, you'll need to install [Google's official auth client](https://www.npmjs.com/package/google-auth-library) as a peer dependency.
To call Vertex AI models in Node, you'll need to install the `@langchain/google-vertexai` package:

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";

<IntegrationInstallTooltip></IntegrationInstallTooltip>

```bash npm2yarn
npm install @langchain/google-vertexai
```

You should make sure the Vertex AI API is
enabled for the relevant project and that you've authenticated to
@@ -28,21 +37,19 @@ Google Cloud using one of these methods:
to the project and set the `GOOGLE_APPLICATION_CREDENTIALS` environment
variable to the path of this file.

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";

<IntegrationInstallTooltip></IntegrationInstallTooltip>

```bash npm2yarn
npm install google-auth-library @langchain/community
npm install @langchain/google-vertexai
```

### Web

To call Vertex AI models in web environments (like Edge functions), you'll need to install
the [`web-auth-library`](https://github.com/kriasoft/web-auth-library) package as a peer dependency:
the `@langchain/google-vertexai-web` package:

```bash npm2yarn
npm install web-auth-library
npm install @langchain/google-vertexai-web
```

Then, you'll need to add your service account credentials directly as a `GOOGLE_VERTEX_AI_WEB_CREDENTIALS` environment variable:
@@ -51,12 +58,12 @@ Then, you'll need to add your service account credentials directly as a `GOOGLE_
```
GOOGLE_VERTEX_AI_WEB_CREDENTIALS={"type":"service_account","project_id":"YOUR_PROJECT-12345",...}
```
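
Since this variable holds raw service account JSON, a quick sanity check before handing it to the client can surface malformed values early. The following is a minimal sketch under that assumption; `parseServiceAccount` is a hypothetical helper, not part of any LangChain package:

```typescript
// Sketch (assumption): validate the service account JSON held in
// GOOGLE_VERTEX_AI_WEB_CREDENTIALS before passing it to the client.
function parseServiceAccount(raw: string): { type: string; project_id: string } {
  const credentials = JSON.parse(raw);
  if (credentials.type !== "service_account") {
    throw new Error(`Expected type "service_account", got "${credentials.type}"`);
  }
  if (typeof credentials.project_id !== "string") {
    throw new Error("Missing project_id in credentials");
  }
  return credentials;
}

// Falls back to a placeholder when the env var is unset.
const creds = parseServiceAccount(
  process.env.GOOGLE_VERTEX_AI_WEB_CREDENTIALS ??
    '{"type":"service_account","project_id":"YOUR_PROJECT-12345"}'
);
console.log(creds.project_id);
```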

You can also pass your credentials directly in code like this:
Lastly, you may also pass your credentials directly in code like this:

```typescript
import { ChatGoogleVertexAI } from "@langchain/community/chat_models/googlevertexai";
import { ChatVertexAI } from "@langchain/google-vertexai-web";

const model = new ChatGoogleVertexAI({
const model = new ChatVertexAI({
authOptions: {
credentials: {"type":"service_account","project_id":"YOUR_PROJECT-12345",...},
  },
});
```

@@ -65,37 +72,50 @@ const model = new ChatGoogleVertexAI({

## Usage

Several models are available and can be specified by the `model` attribute
in the constructor. These include:
The entire family of `gemini` models is available by specifying the `modelName` parameter.

- code-bison (default)
- code-bison-32k
For example:

The ChatGoogleVertexAI class works just like other chat-based LLMs,
with a few exceptions:
import ChatVertexAI from "@examples/models/chat/integration_googlevertexai.ts";

1. The first `SystemMessage` passed in is mapped to the "context" parameter that the PaLM model expects.
No other `SystemMessages` are allowed.
2. After the first `SystemMessage`, there must be an odd number of messages, representing a conversation between a human and the model.
3. Human messages must alternate with AI messages.
<CodeBlock language="typescript">{ChatVertexAI}</CodeBlock>

import ChatGoogleVertexAI from "@examples/models/chat/integration_googlevertexai.ts";

<CodeBlock language="typescript">{ChatGoogleVertexAI}</CodeBlock>
:::tip
See the LangSmith trace for the example above [here](https://smith.langchain.com/public/9fb579d8-4987-4302-beca-29a684ae2f4c/r).
:::
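
The referenced example file isn't reproduced in this diff. As a rough, hedged sketch of typical usage (assuming the `ChatVertexAI` export and `modelName` parameter described above, and Vertex AI credentials configured as in Setup):

```typescript
import { ChatVertexAI } from "@langchain/google-vertexai";

// Any model in the `gemini` family can be selected via `modelName`.
const model = new ChatVertexAI({
  modelName: "gemini-pro",
  temperature: 0,
});

// Requires authenticated access to the Vertex AI API.
const response = await model.invoke("Why is the sky blue?");
console.log(response.content);
```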

### Streaming

ChatGoogleVertexAI also supports streaming in multiple chunks for faster responses:
`ChatVertexAI` also supports streaming in multiple chunks for faster responses:

import ChatVertexAIStreaming from "@examples/models/chat/integration_googlevertexai-streaming.ts";

<CodeBlock language="typescript">{ChatVertexAIStreaming}</CodeBlock>

:::tip
See the LangSmith trace for the example above [here](https://smith.langchain.com/public/ba4cb190-3f60-49aa-a6f8-7d31316d94cf/r).
:::

### Tool calling

`ChatVertexAI` also supports calling the model with a tool:

import ChatVertexAITool from "@examples/models/chat/integration_googlevertexai-tools.ts";

<CodeBlock language="typescript">{ChatVertexAITool}</CodeBlock>

import ChatGoogleVertexAIStreaming from "@examples/models/chat/integration_googlevertexai-streaming.ts";
:::tip
See the LangSmith trace for the example above [here](https://smith.langchain.com/public/49e1c32c-395a-45e2-afba-913aa3389137/r).
:::

<CodeBlock language="typescript">{ChatGoogleVertexAIStreaming}</CodeBlock>
### `withStructuredOutput`

### Examples
Alternatively, you can also use the `withStructuredOutput` method:

There is also an optional `examples` constructor parameter that can help the model understand what an appropriate response
looks like.
import ChatVertexAIWSA from "@examples/models/chat/integration_googlevertexai-wsa.ts";

import ChatGoogleVertexAIExamples from "@examples/models/chat/integration_googlevertexai-examples.ts";
<CodeBlock language="typescript">{ChatVertexAIWSA}</CodeBlock>

<CodeBlock language="typescript">{ChatGoogleVertexAIExamples}</CodeBlock>
:::tip
See the LangSmith trace for the example above [here](https://smith.langchain.com/public/41bbbddb-f357-4bfa-a111-def8294a4514/r).
:::
1 change: 1 addition & 0 deletions docs/core_docs/docs/integrations/chat/index.mdx
@@ -36,6 +36,7 @@ The table shows, for each integration, which features have been implemented with
| ChatFireworks | βœ… | βœ… | βœ… | βœ… | ❌ | ❌ |
| ChatGoogleGenerativeAI | βœ… | βœ… | βœ… | ❌ | ❌ | ❌ |
| ChatGoogleVertexAI | βœ… | βœ… | βœ… | ❌ | ❌ | ❌ |
| ChatVertexAI | βœ… | βœ… | βœ… | ❌ | βœ… | βœ… |
| ChatGooglePaLM | βœ… | ❌ | βœ… | ❌ | ❌ | ❌ |
| ChatGroq | βœ… | βœ… | βœ… | ❌ | ❌ | ❌ |
| ChatLlamaCpp | βœ… | βœ… | βœ… | ❌ | ❌ | ❌ |