docs[minor]: Add chat model tabs to docs where appropriate (#4844)
* docs[minor]: Add chat model tabs to docs where appropriate

* cr

* add more

* chore: lint files

* chore: lint files
bracesproul authored Mar 25, 2024
1 parent e3319d9 commit 77b73a1
Showing 14 changed files with 267 additions and 186 deletions.
16 changes: 8 additions & 8 deletions docs/core_docs/.gitignore
Original file line number Diff line number Diff line change
@@ -57,6 +57,12 @@ docs/use_cases/query_analysis/quickstart.md
docs/use_cases/query_analysis/quickstart.mdx
docs/use_cases/query_analysis/index.md
docs/use_cases/query_analysis/index.mdx
docs/use_cases/extraction/quickstart.md
docs/use_cases/extraction/quickstart.mdx
docs/use_cases/extraction/index.md
docs/use_cases/extraction/index.mdx
docs/use_cases/extraction/guidelines.md
docs/use_cases/extraction/guidelines.mdx
docs/use_cases/graph/semantic.md
docs/use_cases/graph/semantic.mdx
docs/use_cases/graph/quickstart.md
@@ -67,12 +73,6 @@ docs/use_cases/graph/mapping.md
docs/use_cases/graph/mapping.mdx
docs/use_cases/graph/index.md
docs/use_cases/graph/index.mdx
docs/use_cases/extraction/quickstart.md
docs/use_cases/extraction/quickstart.mdx
docs/use_cases/extraction/index.md
docs/use_cases/extraction/index.mdx
docs/use_cases/extraction/guidelines.md
docs/use_cases/extraction/guidelines.mdx
docs/use_cases/query_analysis/techniques/structuring.md
docs/use_cases/query_analysis/techniques/structuring.mdx
docs/use_cases/query_analysis/techniques/step_back.md
@@ -105,12 +105,12 @@ docs/use_cases/extraction/how_to/handle_files.md
docs/use_cases/extraction/how_to/handle_files.mdx
docs/use_cases/extraction/how_to/examples.md
docs/use_cases/extraction/how_to/examples.mdx
docs/modules/memory/chat_messages/custom.md
docs/modules/memory/chat_messages/custom.mdx
docs/modules/model_io/output_parsers/custom.md
docs/modules/model_io/output_parsers/custom.mdx
docs/modules/model_io/chat/function_calling.md
docs/modules/model_io/chat/function_calling.mdx
docs/modules/memory/chat_messages/custom.md
docs/modules/memory/chat_messages/custom.mdx
docs/modules/data_connection/vectorstores/custom.md
docs/modules/data_connection/vectorstores/custom.mdx
docs/modules/model_io/output_parsers/types/openai_tools.md
15 changes: 11 additions & 4 deletions docs/core_docs/docs/expression_language/streaming.ipynb
@@ -66,6 +66,17 @@
"import \"dotenv/config\";"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"```{=mdx}\n",
"import ChatModelTabs from \"@theme/ChatModelTabs\";\n",
"\n",
"<ChatModelTabs />\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 2,
@@ -172,10 +183,6 @@
}
],
"source": [
"import { ChatOpenAI } from \"@langchain/openai\";\n",
"\n",
"const model = new ChatOpenAI({});\n",
"\n",
"const stream = await model.stream(\"Hello! Tell me about yourself.\");\n",
"const chunks = [];\n",
"for await (const chunk of stream) {\n",
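The hunk above removes the inline `ChatOpenAI` setup (the new tabs let the reader pick a provider), while the streaming loop itself is model-agnostic: anything exposing a `.stream()` that yields chunks works. A self-contained sketch with a mock model (no LangChain dependency; `MockChatModel` and its string chunks are illustrative — real LangChain models yield `AIMessageChunk` objects):

```typescript
// Minimal stand-in for a chat model whose .stream() yields string chunks.
class MockChatModel {
  async *stream(_prompt: string): AsyncGenerator<string> {
    for (const piece of ["Hello", ", ", "world", "!"]) {
      yield piece; // a real model would emit tokens as they arrive
    }
  }
}

const model = new MockChatModel();
const stream = model.stream("Hello! Tell me about yourself.");

// Same consumption pattern as the notebook: collect chunks as they stream in.
const chunks: string[] = [];
for await (const chunk of stream) {
  chunks.push(chunk);
}
```

The `for await … of` loop is the whole interface: swapping the mock for any provider from the tabs leaves the loop unchanged.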
25 changes: 14 additions & 11 deletions docs/core_docs/docs/use_cases/query_analysis/how_to/few_shot.ipynb
@@ -37,17 +37,13 @@
"<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
"\n",
"<Npm2Yarn>\n",
" @langchain/core @langchain/openai zod uuid\n",
" @langchain/core zod uuid\n",
"</Npm2Yarn>\n",
"```\n",
"\n",
"#### Set environment variables\n",
"\n",
"We'll use OpenAI in this example:\n",
"\n",
"```\n",
"OPENAI_API_KEY=your-api-key\n",
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
@@ -96,6 +92,18 @@
"## Query generation"
]
},
{
"cell_type": "markdown",
"id": "86f5ee5d",
"metadata": {},
"source": [
"```{=mdx}\n",
"import ChatModelTabs from \"@theme/ChatModelTabs\";\n",
"\n",
"<ChatModelTabs customVarName=\"llm\" />\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 33,
@@ -105,7 +113,6 @@
"source": [
"import { ChatPromptTemplate, MessagesPlaceholder } from \"@langchain/core/prompts\"\n",
"import { RunnablePassthrough, RunnableSequence } from \"@langchain/core/runnables\"\n",
"import { ChatOpenAI } from \"@langchain/openai\"\n",
"\n",
"const system = `You are an expert at converting user questions into database queries.\n",
"You have access to a database of tutorial videos about a software library for building LLM-powered applications.\n",
@@ -123,10 +130,6 @@
" [\"human\", \"{question}\"],\n",
"]\n",
")\n",
"const llm = new ChatOpenAI({\n",
" modelName: \"gpt-3.5-turbo-0125\",\n",
" temperature: 0\n",
"});\n",
"const llmWithTools = llm.withStructuredOutput(searchSchema, {\n",
" name: \"Search\",\n",
"})\n",
@@ -288,7 +291,7 @@
"id": "bd21389c-f862-44e6-9d51-92db10979525",
"metadata": {},
"source": [
"Now we need to update our prompt template and chain so that the examples are included in each prompt. Since we're working with OpenAI function-calling, we'll need to do a bit of extra structuring to send example inputs and outputs to the model. We'll create a `toolExampleToMessages` helper function to handle this for us:"
"Now we need to update our prompt template and chain so that the examples are included in each prompt. Since we're working with LLM model function-calling, we'll need to do a bit of extra structuring to send example inputs and outputs to the model. We'll create a `toolExampleToMessages` helper function to handle this for us:"
]
},
{
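The `toolExampleToMessages` helper mentioned in this file converts example (question, tool call) pairs into the message shapes that function-calling models expect. A plain-TypeScript sketch of the idea (message shapes simplified and illustrative; the real helper builds LangChain `BaseMessage` instances):

```typescript
// Simplified message and example shapes; real code uses LangChain message classes.
type Message =
  | { role: "human"; content: string }
  | { role: "ai"; content: string; toolCalls: { name: string; args: object }[] }
  | { role: "tool"; content: string };

interface Example {
  input: string; // the user question
  toolCalls: { name: string; args: object }[]; // the structured output we want
}

// Turn one example into the message sequence a function-calling model expects:
// human question -> AI tool call -> a tool acknowledgement per call.
function toolExampleToMessages(example: Example): Message[] {
  return [
    { role: "human", content: example.input },
    { role: "ai", content: "", toolCalls: example.toolCalls },
    ...example.toolCalls.map(
      (): Message => ({
        role: "tool",
        content: "You have correctly called this tool.",
      })
    ),
  ];
}

const messages = toolExampleToMessages({
  input: "how to use multi-modal models in a chain?",
  toolCalls: [{ name: "Search", args: { query: "multi-modal models in a chain" } }],
});
```

Prepending a few such message triplets to the prompt is what makes the few-shot examples visible to the model.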
Original file line number Diff line number Diff line change
@@ -37,17 +37,13 @@
"<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
"\n",
"<Npm2Yarn>\n",
" @langchain/core @langchain/community @langchain/openai zod chromadb @faker-js/faker\n",
" @langchain/core @langchain/community zod chromadb @faker-js/faker\n",
"</Npm2Yarn>\n",
"```\n",
"\n",
"#### Set environment variables\n",
"\n",
"We'll use OpenAI in this example:\n",
"\n",
"```\n",
"OPENAI_API_KEY=your-api-key\n",
"\n",
"# Optional, use LangSmith for best-in-class observability\n",
"LANGSMITH_API_KEY=your-api-key\n",
"LANGCHAIN_TRACING_V2=true\n",
@@ -151,6 +147,18 @@
"})"
]
},
{
"cell_type": "markdown",
"id": "0c02d1b3",
"metadata": {},
"source": [
"```{=mdx}\n",
"import ChatModelTabs from \"@theme/ChatModelTabs\";\n",
"\n",
"<ChatModelTabs customVarName=\"llm\" />\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 49,
@@ -160,7 +168,6 @@
"source": [
"import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
"import { RunnablePassthrough, RunnableSequence } from \"@langchain/core/runnables\";\n",
"import { ChatOpenAI } from \"@langchain/openai\";\n",
"\n",
"const system = `Generate a relevant search query for a library system`;\n",
"const prompt = ChatPromptTemplate.fromMessages(\n",
@@ -169,10 +176,6 @@
" [\"human\", \"{question}\"],\n",
" ]\n",
")\n",
"const llm = new ChatOpenAI({\n",
" modelName: \"gpt-3.5-turbo-0125\",\n",
" temperature: 0\n",
"});\n",
"const llmWithTools = llm.withStructuredOutput(searchSchema, {\n",
" name: \"Search\"\n",
"})\n",
@@ -371,17 +374,23 @@
"We can try to use a longer context window... but with so much information in there, it is not garunteed to pick it up reliably"
]
},
{
"cell_type": "markdown",
"id": "618a9762",
"metadata": {},
"source": [
"```{=mdx}\n",
"<ChatModelTabs customVarName=\"llmLong\" openaiParams={`{ modelName: \"gpt-4-turbo-preview\" }`} />\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 55,
"id": "0f0d0757",
"metadata": {},
"outputs": [],
"source": [
"const llmLong = new ChatOpenAI({\n",
" modelName: \"gpt-4-turbo-preview\",\n",
" temperature: 0,\n",
"})\n",
"const structuredLlmLong = llmLong.withStructuredOutput(searchSchema, {\n",
" name: \"Search\"\n",
"});\n",
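Throughout this file, `withStructuredOutput` binds the zod schema to the model as a tool and parses the tool call back into a typed object. A dependency-free sketch of that parse-and-validate step (the `parseSearch` helper and its field names are hypothetical; LangChain validates via zod rather than by hand):

```typescript
// The shape the search schema in this guide describes (simplified).
interface Search {
  query: string;
  author?: string;
}

// A function-calling model returns tool arguments as a JSON string,
// which must be parsed and checked before use.
function parseSearch(rawArgs: string): Search {
  const parsed = JSON.parse(rawArgs);
  if (typeof parsed.query !== "string") {
    throw new Error("tool call missing required 'query' field");
  }
  if (parsed.author !== undefined && typeof parsed.author !== "string") {
    throw new Error("'author' must be a string when present");
  }
  return { query: parsed.query, author: parsed.author };
}

const result = parseSearch('{"query": "aliasing", "author": "Jesse Knight"}');
```

This is the step that turns free-form model output into a value the rest of the chain can rely on; malformed calls fail loudly instead of propagating.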
Original file line number Diff line number Diff line change
@@ -41,8 +41,6 @@
"\n",
"#### Set environment variables\n",
"\n",
"We'll use OpenAI in this example:\n",
"\n",
"```\n",
"OPENAI_API_KEY=your-api-key\n",
"\n",
@@ -139,6 +137,18 @@
"}).describe(\"Search over a database of job records.\");"
]
},
{
"cell_type": "markdown",
"id": "013a5041",
"metadata": {},
"source": [
"```{=mdx}\n",
"import ChatModelTabs from \"@theme/ChatModelTabs\";\n",
"\n",
"<ChatModelTabs customVarName=\"llm\" />\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 4,
@@ -148,7 +158,6 @@
"source": [
"import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
"import { RunnableSequence, RunnablePassthrough } from \"@langchain/core/runnables\";\n",
"import { ChatOpenAI } from \"@langchain/openai\";\n",
"\n",
"const system = `You have the ability to issue search queries to get information to help answer user information.\n",
"\n",
@@ -158,10 +167,6 @@
" [\"system\", system],\n",
" [\"human\", \"{question}\"],\n",
"])\n",
"const llm = new ChatOpenAI({\n",
" modelName: \"gpt-3.5-turbo-0125\",\n",
" temperature: 0,\n",
"});\n",
"const llmWithTools = llm.withStructuredOutput(searchSchema, {\n",
" name: \"Search\"\n",
"});\n",
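The chains in these guides compose prompt → model → parser with `RunnableSequence` and feed the raw question through with `RunnablePassthrough`. Underneath, the idea is plain function composition; a dependency-free sketch (all names and the toy stages are illustrative, not LangChain's API):

```typescript
// A "runnable" here is just a function from input to output.
type Runnable<I, O> = (input: I) => O;

// Compose two runnables left to right, like RunnableSequence.from([a, b]).
function sequence<A, B, C>(first: Runnable<A, B>, second: Runnable<B, C>): Runnable<A, C> {
  return (input) => second(first(input));
}

// Pass the input through unchanged, like RunnablePassthrough.
const passthrough = <T>(input: T): T => input;

// Toy "prompt" and "model" stages standing in for the real ones.
const prompt: Runnable<{ question: string }, string> = ({ question }) =>
  `You are an expert... Question: ${question}`;
const fakeModel: Runnable<string, string> = (formatted) =>
  `answered(${formatted.length} chars)`;

const chain = sequence(sequence(passthrough, prompt), fakeModel);
const out = chain({ question: "videos on RAG" });
```

The real classes add batching, streaming, and tracing on top, but invocation order is exactly this nesting.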
Original file line number Diff line number Diff line change
@@ -41,8 +41,6 @@
"\n",
"#### Set environment variables\n",
"\n",
"We'll use OpenAI in this example:\n",
"\n",
"```\n",
"OPENAI_API_KEY=your-api-key\n",
"\n",
@@ -150,6 +148,18 @@
"})"
]
},
{
"cell_type": "markdown",
"id": "a3c79210",
"metadata": {},
"source": [
"```{=mdx}\n",
"import ChatModelTabs from \"@theme/ChatModelTabs\";\n",
"\n",
"<ChatModelTabs customVarName=\"llm\" />\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 6,
@@ -159,7 +169,6 @@
"source": [
"import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
"import { RunnableSequence, RunnablePassthrough } from \"@langchain/core/runnables\";\n",
"import { ChatOpenAI } from \"@langchain/openai\";\n",
"\n",
"const system = `You have the ability to issue search queries to get information to help answer user information.`\n",
"const prompt = ChatPromptTemplate.fromMessages(\n",
@@ -168,10 +177,6 @@
" [\"human\", \"{question}\"],\n",
"]\n",
")\n",
"const llm = new ChatOpenAI({\n",
"modelName: \"gpt-3.5-turbo-0125\",\n",
"temperature: 0\n",
"});\n",
"const llmWithTools = llm.withStructuredOutput(searchSchema, {\n",
"name: \"Search\"\n",
"})\n",
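Each of these chains starts from `ChatPromptTemplate.fromMessages`, which turns `[role, template]` pairs into formatted messages by substituting `{question}`-style variables. A dependency-free sketch of that substitution step (simplified; the real class also validates input variables and supports message placeholders):

```typescript
type MessageTemplate = [role: string, template: string];

// Substitute {var} occurrences in each template with values from the input.
function formatMessages(
  templates: MessageTemplate[],
  values: Record<string, string>
): { role: string; content: string }[] {
  return templates.map(([role, template]) => ({
    role,
    // leave unknown {names} untouched rather than erasing them
    content: template.replace(/\{(\w+)\}/g, (match, name) =>
      name in values ? values[name] : match
    ),
  }));
}

const formatted = formatMessages(
  [
    ["system", "You have the ability to issue search queries."],
    ["human", "{question}"],
  ],
  { question: "jobs with at least 5 years of experience" }
);
```

The formatted messages are what the chain hands to whichever model the tabs configure.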
Original file line number Diff line number Diff line change
@@ -43,8 +43,6 @@
"\n",
"#### Set environment variables\n",
"\n",
"We'll use OpenAI in this example:\n",
"\n",
"```\n",
"OPENAI_API_KEY=your-api-key\n",
"\n",
@@ -136,6 +134,18 @@
"});"
]
},
{
"cell_type": "markdown",
"id": "b7916d00",
"metadata": {},
"source": [
"```{=mdx}\n",
"import ChatModelTabs from \"@theme/ChatModelTabs\";\n",
"\n",
"<ChatModelTabs customVarName=\"llm\" />\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 25,
@@ -146,7 +156,6 @@
"import { zodToJsonSchema } from \"zod-to-json-schema\";\n",
"import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
"import { RunnableSequence, RunnablePassthrough } from \"@langchain/core/runnables\";\n",
"import { ChatOpenAI } from \"@langchain/openai\";\n",
"\n",
"const system = `You have the ability to issue search queries to get information to help answer user information.\n",
"\n",
@@ -157,10 +166,6 @@
" [\"human\", \"{question}\"],\n",
" ]\n",
")\n",
"const llm = new ChatOpenAI({\n",
" modelName: \"gpt-3.5-turbo-0125\",\n",
" temperature: 0\n",
"});\n",
"const llmWithTools = llm.bind({\n",
" tools: [{\n",
" type: \"function\" as const,\n",
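This last file binds the tool manually with `llm.bind({ tools: [...] })`, using `zodToJsonSchema` to turn the zod schema into the JSON Schema that function-calling APIs expect. A hand-rolled sketch of that conversion for a flat object schema (illustrative only; `zodToJsonSchema` handles nesting, enums, and much more):

```typescript
// A tiny schema description standing in for a zod object schema.
interface FieldSpec {
  type: "string" | "number";
  description: string;
  optional?: boolean;
}
type SchemaSpec = Record<string, FieldSpec>;

// Convert the spec into the JSON-Schema-shaped "parameters" object
// that OpenAI-style tool definitions use.
function toToolDefinition(name: string, description: string, spec: SchemaSpec) {
  const properties: Record<string, object> = {};
  const required: string[] = [];
  for (const [field, def] of Object.entries(spec)) {
    properties[field] = { type: def.type, description: def.description };
    if (!def.optional) required.push(field); // optional fields stay out of "required"
  }
  return {
    type: "function" as const,
    function: { name, description, parameters: { type: "object", properties, required } },
  };
}

const searchTool = toToolDefinition("Search", "Search over a database of job records.", {
  query: { type: "string", description: "Similarity search query" },
  minYearsExperience: { type: "number", description: "Minimum years of experience", optional: true },
});
```

Binding such a definition is the lower-level equivalent of `withStructuredOutput`: the schema travels with every model call so the provider can constrain the tool arguments.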
