
Commit

Merge branch 'main' into falkordb_graph
gkorland authored Dec 15, 2024
2 parents 4ff411d + 2cdf57c commit 08a70d9
Showing 38 changed files with 1,510 additions and 98 deletions.
1 change: 1 addition & 0 deletions docs/api_refs/blacklisted-entrypoints.json
@@ -7,6 +7,7 @@
"../../langchain/src/tools/connery.ts",
"../../langchain/src/tools/gmail.ts",
"../../langchain/src/tools/google_places.ts",
"../../langchain/src/tools/google_trends.ts",
"../../langchain/src/embeddings/bedrock.ts",
"../../langchain/src/embeddings/cloudflare_workersai.ts",
"../../langchain/src/embeddings/ollama.ts",
Expand Down
141 changes: 138 additions & 3 deletions docs/core_docs/docs/integrations/chat/google_vertex_ai.ipynb
@@ -21,8 +21,8 @@
"source": [
"# ChatVertexAI\n",
"\n",
"[Google Vertex](https://cloud.google.com/vertex-ai) is a service that exposes all foundation models available in Google Cloud, like `gemini-1.5-pro`, `gemini-1.5-flash`, etc.",
"It also provides some non-Google models such as [Anthropic's Claude](https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-claude).",
"[Google Vertex](https://cloud.google.com/vertex-ai) is a service that exposes all foundation models available in Google Cloud, like `gemini-1.5-pro`, `gemini-2.0-flash-exp`, etc.\n",
"It also provides some non-Google models such as [Anthropic's Claude](https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-claude).\n",
"\n",
"\n",
"This will help you getting started with `ChatVertexAI` [chat models](/docs/concepts/chat_models). For detailed documentation of all `ChatVertexAI` features and configurations head to the [API reference](https://api.js.langchain.com/classes/langchain_google_vertexai.ChatVertexAI.html).\n",
@@ -116,7 +116,7 @@
"// import { ChatVertexAI } from \"@langchain/google-vertexai-web\"\n",
"\n",
"const llm = new ChatVertexAI({\n",
" model: \"gemini-1.5-pro\",\n",
" model: \"gemini-2.0-flash-exp\",\n",
" temperature: 0,\n",
" maxRetries: 2,\n",
" // For web, authOptions.credentials\n",
@@ -191,6 +191,141 @@
"console.log(aiMsg.content)"
]
},
{
"cell_type": "markdown",
"id": "de2480fa",
"metadata": {},
"source": [
"## Tool Calling with Google Search Retrieval\n",
"\n",
"It is possible to call the model with a Google search tool which you can use to [ground](https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/grounding) content generation with real-world information and reduce hallucinations.\n",
"\n",
"Grounding is currently not supported by `gemini-2.0-flash-exp`.\n",
"\n",
"You can choose to either ground using Google Search or by using a custom data store. Here are examples of both: "
]
},
{
"cell_type": "markdown",
"id": "fd2091ba",
"metadata": {},
"source": [
"### Google Search Retrieval\n",
"\n",
"Grounding example that uses Google Search:\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "65d019ee",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The Boston Celtics won the 2024 NBA Finals, defeating the Dallas Mavericks 4-1 in the series to claim their 18th NBA championship. This victory marked their first title since 2008 and established them as the team with the most NBA championships, surpassing the Los Angeles Lakers' 17 titles.\n",
"\n"
]
}
],
"source": [
"import { ChatVertexAI } from \"@langchain/google-vertexai\"\n",
"\n",
"const searchRetrievalTool = {\n",
" googleSearchRetrieval: {\n",
" dynamicRetrievalConfig: {\n",
" mode: \"MODE_DYNAMIC\", // Use Dynamic Retrieval\n",
" dynamicThreshold: 0.7, // Default for Dynamic Retrieval threshold\n",
" },\n",
" },\n",
"};\n",
"\n",
"const searchRetrievalModel = new ChatVertexAI({\n",
" model: \"gemini-1.5-pro\",\n",
" temperature: 0,\n",
" maxRetries: 0,\n",
"}).bindTools([searchRetrievalTool]);\n",
"\n",
"const searchRetrievalResult = await searchRetrievalModel.invoke(\"Who won the 2024 NBA Finals?\");\n",
"\n",
"console.log(searchRetrievalResult.content);"
]
},
{
"cell_type": "markdown",
"id": "ac3a4a98",
"metadata": {},
"source": [
"### Google Search Retrieval with Data Store\n",
"\n",
"First, set up your data store (this is a schema of an example data store):\n",
"\n",
"| ID | Date | Team 1 | Score | Team 2 |\n",
"|:-------:|:------------:|:-----------:|:--------:|:----------:|\n",
"| 3001 | 2023-09-07 | Argentina | 1 - 0 | Ecuador |\n",
"| 3002 | 2023-09-12 | Venezuela | 1 - 0 | Paraguay |\n",
"| 3003 | 2023-09-12 | Chile | 0 - 0 | Colombia |\n",
"| 3004 | 2023-09-12 | Peru | 0 - 1 | Brazil |\n",
"| 3005 | 2024-10-15 | Argentina | 6 - 0 | Bolivia |\n",
"\n",
"Then, use this data store in the example provided below:\n",
"\n",
"(Note that you have to use your own variables for `projectId` and `datastoreId`)\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a6a539d9",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Argentina won against Bolivia with a score of 6-0 on October 15, 2024.\n",
"\n"
]
}
],
"source": [
"import { ChatVertexAI } from \"@langchain/google-vertexai\";\n",
"\n",
"const projectId = \"YOUR_PROJECT_ID\";\n",
"const datastoreId = \"YOUR_DATASTORE_ID\";\n",
"\n",
"const searchRetrievalToolWithDataset = {\n",
" retrieval: {\n",
" vertexAiSearch: {\n",
" datastore: `projects/${projectId}/locations/global/collections/default_collection/dataStores/${datastoreId}`,\n",
" },\n",
" disableAttribution: false,\n",
" },\n",
"};\n",
"\n",
"const searchRetrievalModelWithDataset = new ChatVertexAI({\n",
" model: \"gemini-1.5-pro\",\n",
" temperature: 0,\n",
" maxRetries: 0,\n",
"}).bindTools([searchRetrievalToolWithDataset]);\n",
"\n",
"const searchRetrievalModelResult = await searchRetrievalModelWithDataset.invoke(\n",
" \"What is the score of Argentina vs Bolivia football game?\"\n",
");\n",
"\n",
"console.log(searchRetrievalModelResult.content);"
]
},
{
"cell_type": "markdown",
"id": "8d11f2be",
"metadata": {},
"source": [
"You should now get results that are grounded in the data from your provided data store."
]
},
{
"cell_type": "markdown",
"id": "18e2bfc0-7e78-4528-a73f-499ac150dca8",
8 changes: 8 additions & 0 deletions docs/core_docs/docs/integrations/chat/index.mdx
@@ -12,6 +12,14 @@ hide_table_of_contents: true
If you'd like to write your own chat model, see [this how-to](/docs/how_to/custom_chat). If you'd like to contribute an integration, see [Contributing integrations](/docs/contributing).
:::

import ChatModelTabs from "@theme/ChatModelTabs";

<ChatModelTabs openaiParams={`{ model: "gpt-4o-mini" }`} />

```typescript
await model.invoke("Hello, world!");
```

## Featured providers

| Model | Stream | JSON mode | [Tool Calling](/docs/how_to/tool_calling/) | [`withStructuredOutput()`](/docs/how_to/structured_output/#the-.withstructuredoutput-method) | [Multimodal](/docs/how_to/multimodal_inputs/) |
4 changes: 2 additions & 2 deletions docs/core_docs/docs/integrations/memory/file.mdx
@@ -4,9 +4,9 @@ hide_table_of_contents: true

import CodeBlock from "@theme/CodeBlock";

- # File Chat Message History
+ # File System Chat Message History

- The `FileChatMessageHistory` uses a JSON file to store chat message history. For longer-term persistence across chat sessions, you can swap out the default in-memory `chatHistory` that backs chat memory classes like `BufferMemory`.
+ The `FileSystemChatMessageHistory` uses a JSON file to store chat message history. For longer-term persistence across chat sessions, you can swap out the default in-memory `chatHistory` that backs chat memory classes like `BufferMemory`.
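
As a rough illustration, a minimal sketch of swapping the file-backed history into `BufferMemory` might look like the following. The import path and the `sessionId`/`filePath` constructor options are assumptions made for this example; check the `FileSystemChatMessageHistory` API reference for the exact signature.

```typescript
// Minimal sketch — the import path and constructor options below are assumptions.
import { FileSystemChatMessageHistory } from "@langchain/community/stores/message/file_system";
import { BufferMemory } from "langchain/memory";

// Assumed options: a session identifier and a JSON file to persist messages in.
const chatHistory = new FileSystemChatMessageHistory({
  sessionId: "example-session",      // hypothetical session id
  filePath: "./chat_history.json",   // hypothetical storage location
});

// Swap the default in-memory history for the file-backed one.
const memory = new BufferMemory({ chatHistory });

await memory.chatHistory.addUserMessage("Hello!");
console.log(await memory.chatHistory.getMessages());
```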

## Setup

8 changes: 8 additions & 0 deletions docs/core_docs/docs/integrations/text_embedding/index.mdx
@@ -9,6 +9,14 @@

This page documents integrations with various model providers that allow you to use embeddings in LangChain.

import EmbeddingTabs from "@theme/EmbeddingTabs";

<EmbeddingTabs />

```javascript
await embeddings.embedQuery("Hello, world!");
```

import { CategoryTable, IndexTable } from "@theme/FeatureTables";

<IndexTable />
185 changes: 185 additions & 0 deletions docs/core_docs/docs/integrations/tools/google_scholar.ipynb
@@ -0,0 +1,185 @@
{
"cells": [
{
"cell_type": "raw",
"id": "10238e62-3465-4973-9279-606cbb7ccf16",
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: Google Scholar\n",
"---"
]
},
{
"cell_type": "markdown",
"id": "a6f91f20",
"metadata": {},
"source": [
"# Google Scholar Tool\n",
"\n",
"This notebook provides a quick overview for getting started with [`SERPGoogleScholarTool`](https://api.js.langchain.com/classes/_langchain_community.tools_google_scholar.SERPGoogleScholarAPITool.html). For detailed documentation of all `SERPGoogleScholarAPITool` features and configurations, head to the [API reference](https://api.js.langchain.com/classes/_langchain_community.tools_google_scholar.SERPGoogleScholarAPITool.html).\n",
"\n",
"## Overview\n",
"\n",
"### Integration details\n",
"\n",
"| Class | Package | [PY support](https://python.langchain.com/docs/integrations/tools/google_scholar/) | Package latest |\n",
"| :--- | :--- | :---: | :---: |\n",
"| [GoogleScholarTool](https://api.js.langchain.com/classes/_langchain_community.tools_google_scholar.SERPGoogleScholarAPITool.html) | [@langchain/community](https://www.npmjs.com/package/@langchain/community) | ✅ | ![NPM - Version](https://img.shields.io/npm/v/@langchain/community?style=flat-square&label=%20&) |\n",
"\n",
"### Tool features\n",
"\n",
"- Retrieve academic publications by topic, author, or query.\n",
"- Fetch metadata such as title, author, and publication year.\n",
"- Advanced search filters, including citation count and journal name.\n",
"\n",
"## Setup\n",
"\n",
"The integration lives in the `@langchain/community` package.\n",
"\n",
"```bash\n",
"npm install @langchain/community\n",
"```\n",
"\n",
"### Credentials\n",
"\n",
"Ensure you have the appropriate API key to access Google Scholar. Set it in your environment variables:\n",
"\n",
"```typescript\n",
"process.env.GOOGLE_SCHOLAR_API_KEY=\"your-serp-api-key\"\n",
"```\n",
"\n",
"It's also helpful to set up [LangSmith](https://smith.langchain.com/) for best-in-class observability:\n",
"\n",
"```typescript\n",
"process.env.LANGCHAIN_TRACING_V2=\"true\"\n",
"process.env.LANGCHAIN_API_KEY=\"your-langchain-api-key\"\n",
"```"
]
},
{
"cell_type": "markdown",
"id": "1c97218f-f366-479d-8bf7-fe9f2f6df73f",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"You can import and instantiate an instance of the `SERPGoogleScholarAPITool` tool like this:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "8b3ddfe9-ca79-494c-a7ab-1f56d9407a64",
"metadata": {
"vscode": {
"languageId": "typescript"
}
},
"outputs": [],
"source": [
"import { SERPGoogleScholarAPITool } from \"@langchain/community/tools/google_scholar\";\n",
"\n",
"const tool = new SERPGoogleScholarAPITool({\n",
" apiKey: process.env.SERPAPI_API_KEY,\n",
"});"
]
},
{
"cell_type": "markdown",
"id": "74147a1a",
"metadata": {},
"source": [
"## Invocation\n",
"\n",
"### Invoke directly with args\n",
"\n",
"You can invoke the tool directly with query arguments:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "65310a8b-eb0c-4d9e-a618-4f4abe2414fc",
"metadata": {
"vscode": {
"languageId": "typescript"
}
},
"outputs": [],
"source": [
"const results = await tool.invoke({\n",
" query: \"neural networks\",\n",
" maxResults: 5,\n",
"});\n",
"\n",
"console.log(results);"
]
},
{
"cell_type": "markdown",
"id": "d6e73897",
"metadata": {},
"source": [
"### Invoke with ToolCall\n",
"\n",
"We can also invoke the tool with a model-generated `ToolCall`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f90e33a7",
"metadata": {
"vscode": {
"languageId": "typescript"
}
},
"outputs": [],
"source": [
"const modelGeneratedToolCall = {\n",
" args: { query: \"machine learning\" },\n",
" id: \"1\",\n",
" name: tool.name,\n",
" type: \"tool_call\",\n",
"};\n",
"await tool.invoke(modelGeneratedToolCall);"
]
},
{
"cell_type": "markdown",
"id": "93848b02",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of all `SERPGoogleScholarAPITool` features and configurations, head to the [API reference](https://api.js.langchain.com/classes/_langchain_community.tools_google_scholar.SERPGoogleScholarAPITool.html)."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "poetry-venv-311",
"language": "python",
"name": "poetry-venv-311"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.9"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
