diff --git a/docs/core_docs/docs/integrations/chat/togetherai.ipynb b/docs/core_docs/docs/integrations/chat/togetherai.ipynb
new file mode 100644
index 000000000000..1b28aa9f2582
--- /dev/null
+++ b/docs/core_docs/docs/integrations/chat/togetherai.ipynb
@@ -0,0 +1,343 @@
+{
+ "cells": [
+ {
+ "cell_type": "raw",
+ "id": "afaf8039",
+ "metadata": {
+ "vscode": {
+ "languageId": "raw"
+ }
+ },
+ "source": [
+ "---\n",
+ "sidebar_label: Together\n",
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "e49f1e0d",
+ "metadata": {},
+ "source": [
+ "# ChatTogetherAI\n",
+ "\n",
+ "This will help you getting started with `ChatTogetherAI` [chat models](/docs/concepts/#chat-models). For detailed documentation of all `ChatTogetherAI` features and configurations head to the [API reference](https://api.js.langchain.com/classes/langchain_community_chat_models_togetherai.ChatTogetherAI.html).\n",
+ "\n",
+ "## Overview\n",
+ "### Integration details\n",
+ "\n",
+ "| Class | Package | Local | Serializable | [PY support](https://python.langchain.com/v0.2/docs/integrations/chat/togetherai) | Package downloads | Package latest |\n",
+ "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
+ "| [ChatTogetherAI](https://api.js.langchain.com/classes/langchain_community_chat_models_togetherai.ChatTogetherAI.html) | [@langchain/community](https://api.js.langchain.com/modules/langchain_community_chat_models_togetherai.html) | ❌ | ✅ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/community?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/community?style=flat-square&label=%20&) |\n",
+ "\n",
+ "### Model features\n",
+ "| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
+ "| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
+ "| ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | \n",
+ "\n",
+ "## Setup\n",
+ "\n",
+ "To access `ChatTogetherAI` models you'll need to create a Together account, get an API key [here](https://api.together.xyz/), and install the `@langchain/community` integration package.\n",
+ "\n",
+ "### Credentials\n",
+ "\n",
+ "Head to [api.together.ai](https://api.together.ai/) to sign up to TogetherAI and generate an API key. Once you've done this set the `TOGETHER_AI_API_KEY` environment variable:\n",
+ "\n",
+ "```bash\n",
+ "export TOGETHER_AI_API_KEY=\"your-api-key\"\n",
+ "```\n",
+ "\n",
+ "If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
+ "\n",
+ "```bash\n",
+ "# export LANGCHAIN_TRACING_V2=\"true\"\n",
+ "# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
+ "```\n",
+ "\n",
+ "### Installation\n",
+ "\n",
+ "The LangChain ChatTogetherAI integration lives in the `@langchain/community` package:\n",
+ "\n",
+ "```{=mdx}\n",
+ "import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
+ "import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
+ "\n",
+ "\n",
+ "\n",
+ "\n",
+ " @langchain/community\n",
+ "\n",
+ "\n",
+ "```"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "a38cde65-254d-4219-a441-068766c0d4b5",
+ "metadata": {},
+ "source": [
+ "## Instantiation\n",
+ "\n",
+ "Now we can instantiate our model object and generate chat completions:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import { ChatTogetherAI } from \"@langchain/community/chat_models/togetherai\"\n",
+ "\n",
+ "const llm = new ChatTogetherAI({\n",
+ " model: \"mistralai/Mixtral-8x7B-Instruct-v0.1\",\n",
+ " temperature: 0,\n",
+ " maxTokens: undefined,\n",
+ " timeout: undefined,\n",
+ " maxRetries: 2,\n",
+ " // other params...\n",
+ "})"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "2b4f3e15",
+ "metadata": {},
+ "source": [
+ "## Invocation"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "id": "62e0dbc3",
+ "metadata": {
+ "tags": []
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "AIMessage {\n",
+ " \"id\": \"chatcmpl-9rT9qEDPZ6iLCk6jt3XTzVDDH6pcI\",\n",
+ " \"content\": \"J'adore la programmation.\",\n",
+ " \"additional_kwargs\": {},\n",
+ " \"response_metadata\": {\n",
+ " \"tokenUsage\": {\n",
+ " \"completionTokens\": 8,\n",
+ " \"promptTokens\": 31,\n",
+ " \"totalTokens\": 39\n",
+ " },\n",
+ " \"finish_reason\": \"stop\"\n",
+ " },\n",
+ " \"tool_calls\": [],\n",
+ " \"invalid_tool_calls\": [],\n",
+ " \"usage_metadata\": {\n",
+ " \"input_tokens\": 31,\n",
+ " \"output_tokens\": 8,\n",
+ " \"total_tokens\": 39\n",
+ " }\n",
+ "}\n"
+ ]
+ }
+ ],
+ "source": [
+ "const aiMsg = await llm.invoke([\n",
+ " [\n",
+ " \"system\",\n",
+ " \"You are a helpful assistant that translates English to French. Translate the user sentence.\",\n",
+ " ],\n",
+ " [\"human\", \"I love programming.\"],\n",
+ "])\n",
+ "aiMsg"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "id": "d86145b3-bfef-46e8-b227-4dda5c9c2705",
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "J'adore la programmation.\n"
+ ]
+ }
+ ],
+ "source": [
+ "console.log(aiMsg.content)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "18e2bfc0-7e78-4528-a73f-499ac150dca8",
+ "metadata": {},
+ "source": [
+ "## Chaining\n",
+ "\n",
+ "We can [chain](/docs/how_to/sequence/) our model with a prompt template like so:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 4,
+ "id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b",
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "AIMessage {\n",
+ " \"id\": \"chatcmpl-9rT9wolZWfJ3xovORxnkdf1rcPbbY\",\n",
+ " \"content\": \"Ich liebe das Programmieren.\",\n",
+ " \"additional_kwargs\": {},\n",
+ " \"response_metadata\": {\n",
+ " \"tokenUsage\": {\n",
+ " \"completionTokens\": 6,\n",
+ " \"promptTokens\": 26,\n",
+ " \"totalTokens\": 32\n",
+ " },\n",
+ " \"finish_reason\": \"stop\"\n",
+ " },\n",
+ " \"tool_calls\": [],\n",
+ " \"invalid_tool_calls\": [],\n",
+ " \"usage_metadata\": {\n",
+ " \"input_tokens\": 26,\n",
+ " \"output_tokens\": 6,\n",
+ " \"total_tokens\": 32\n",
+ " }\n",
+ "}\n"
+ ]
+ }
+ ],
+ "source": [
+ "import { ChatPromptTemplate } from \"@langchain/core/prompts\"\n",
+ "\n",
+ "const prompt = ChatPromptTemplate.fromMessages(\n",
+ " [\n",
+ " [\n",
+ " \"system\",\n",
+ " \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
+ " ],\n",
+ " [\"human\", \"{input}\"],\n",
+ " ]\n",
+ ")\n",
+ "\n",
+ "const chain = prompt.pipe(llm);\n",
+ "await chain.invoke(\n",
+ " {\n",
+ " input_language: \"English\",\n",
+ " output_language: \"German\",\n",
+ " input: \"I love programming.\",\n",
+ " }\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "d1ee55bc-ffc8-4cfa-801c-993953a08cfd",
+ "metadata": {},
+ "source": [
+ "## Tool calling & JSON mode\n",
+ "\n",
+ "The TogetherAI chat supports JSON mode and calling tools.\n",
+ "\n",
+ "### Tool calling"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "id": "8de584a8",
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "[\n",
+ " {\n",
+ " name: 'calculator',\n",
+ " args: { input: '2 + 3' },\n",
+ " type: 'tool_call',\n",
+ " id: 'call_nhtnmganqJPAG9I1cN8ULI9R'\n",
+ " }\n",
+ "]\n"
+ ]
+ }
+ ],
+ "source": [
+ "import { ChatTogetherAI } from \"@langchain/community/chat_models/togetherai\";\n",
+ "import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
+ "import { convertToOpenAITool } from \"@langchain/core/utils/function_calling\";\n",
+ "import { Calculator } from \"@langchain/community/tools/calculator\";\n",
+ "\n",
+ "// Use a pre-built tool\n",
+ "const calculatorTool = convertToOpenAITool(new Calculator());\n",
+ "\n",
+ "const modelWithCalculator = new ChatTogetherAI({\n",
+ " temperature: 0,\n",
+ " // This is the default env variable name it will look for if none is passed.\n",
+ " apiKey: process.env.TOGETHER_AI_API_KEY,\n",
+ " // Together JSON mode/tool calling only supports a select number of models\n",
+ " model: \"mistralai/Mixtral-8x7B-Instruct-v0.1\",\n",
+ "}).bind({\n",
+ " // Bind the tool to the model.\n",
+ " tools: [calculatorTool],\n",
+ " tool_choice: calculatorTool, // Specify what tool the model should use\n",
+ "});\n",
+ "\n",
+ "const promptForTools = ChatPromptTemplate.fromMessages([\n",
+ " [\"system\", \"You are a super not-so-smart mathmatician.\"],\n",
+ " [\"human\", \"Help me out, how can I add {math}?\"],\n",
+ "]);\n",
+ "\n",
+ "// Use LCEL to chain the prompt to the model.\n",
+ "const responseWithTool = await promptForTools.pipe(modelWithCalculator).invoke({\n",
+ " math: \"2 plus 3\",\n",
+ "});\n",
+ "\n",
+ "console.dir(responseWithTool.tool_calls, { depth: null });"
+ ]
+ },
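+ {
+ "cell_type": "markdown",
+ "id": "f3c9a1d2",
+ "metadata": {},
+ "source": [
+ "### JSON mode\n",
+ "\n",
+ "To use JSON mode, you must include the string \"JSON\" somewhere in your prompt. A typical convention is an instruction like `Respond to the user in JSON format`.\n",
+ "\n",
+ "The cell below is a minimal sketch rather than a definitive implementation: it assumes the OpenAI-style `response_format` call option can be passed through `.bind()`, and that the selected model supports JSON mode on Together's API (only certain models do)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "7b4e2f8c",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import { ChatTogetherAI } from \"@langchain/community/chat_models/togetherai\";\n",
+ "import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
+ "\n",
+ "// Enable JSON mode by passing the OpenAI-style `response_format` call option\n",
+ "// through `.bind()`. Assumes the chosen model supports JSON mode on Together.\n",
+ "const modelWithJsonMode = new ChatTogetherAI({\n",
+ "  temperature: 0,\n",
+ "  apiKey: process.env.TOGETHER_AI_API_KEY,\n",
+ "  model: \"mistralai/Mixtral-8x7B-Instruct-v0.1\",\n",
+ "}).bind({\n",
+ "  response_format: {\n",
+ "    type: \"json_object\",\n",
+ "  },\n",
+ "});\n",
+ "\n",
+ "// The word \"JSON\" must appear in the prompt for JSON mode to take effect.\n",
+ "const promptForJson = ChatPromptTemplate.fromMessages([\n",
+ "  [\"system\", \"You are a helpful assistant. Respond to the user in JSON format only.\"],\n",
+ "  [\"human\", \"Please list the numbers 1, 4, 2 and 8 in descending order.\"],\n",
+ "]);\n",
+ "\n",
+ "// Chain the prompt to the model with LCEL, as in the tool calling example above.\n",
+ "const responseWithJson = await promptForJson.pipe(modelWithJsonMode).invoke({});\n",
+ "\n",
+ "console.log(responseWithJson.content);"
+ ]
+ },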
+ {
+ "cell_type": "markdown",
+ "id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
+ "metadata": {},
+ "source": [
+ "Behind the scenes, TogetherAI uses the OpenAI SDK and OpenAI compatible API, with some caveats:\n",
+ "\n",
+ "- Certain properties are not supported by the TogetherAI API, see [here](https://docs.together.ai/reference/chat-completions).\n",
+ "\n",
+ "## API reference\n",
+ "\n",
+ "For detailed documentation of all ChatTogetherAI features and configurations head to the API reference: https://api.js.langchain.com/classes/langchain_community_chat_models_togetherai.ChatTogetherAI.html"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "TypeScript",
+ "language": "typescript",
+ "name": "tslab"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "mode": "typescript",
+ "name": "javascript",
+ "typescript": true
+ },
+ "file_extension": ".ts",
+ "mimetype": "text/typescript",
+ "name": "typescript",
+ "version": "3.7.2"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
diff --git a/docs/core_docs/docs/integrations/chat/togetherai.mdx b/docs/core_docs/docs/integrations/chat/togetherai.mdx
deleted file mode 100644
index f938be05c8c4..000000000000
--- a/docs/core_docs/docs/integrations/chat/togetherai.mdx
+++ /dev/null
@@ -1,65 +0,0 @@
----
-sidebar_label: TogetherAI
----
-
-import CodeBlock from "@theme/CodeBlock";
-
-# ChatTogetherAI
-
-## Setup
-
-1. Create a TogetherAI account and get your API key [here](https://api.together.xyz/).
-2. Export or set your API key inline. The ChatTogetherAI class defaults to `process.env.TOGETHER_AI_API_KEY`.
-
-```bash
-export TOGETHER_AI_API_KEY=your-api-key
-```
-
-You can use models provided by TogetherAI as follows:
-
-import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
-
-<IntegrationInstallTooltip></IntegrationInstallTooltip>
-
-```bash npm2yarn
-npm install @langchain/community
-```
-
-import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";
-
-<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>
-
-import TogetherAI from "@examples/models/chat/integration_togetherai.ts";
-
-<CodeBlock language="typescript">{TogetherAI}</CodeBlock>
-
-## Tool calling & JSON mode
-
-The TogetherAI chat supports JSON mode and calling tools.
-
-### Tool calling
-
-import TogetherToolsExample from "@examples/models/chat/integration_togetherai_tools.ts";
-
-<CodeBlock language="typescript">{TogetherToolsExample}</CodeBlock>
-
-:::tip
-See a LangSmith trace of the above example [here](https://smith.langchain.com/public/5082ea20-c2de-410f-80e2-dbdfbf4d8adb/r).
-:::
-
-### JSON mode
-
-To use JSON mode you must include the string "JSON" inside the prompt.
-Typical conventions include telling the model to use JSON, eg: `Respond to the user in JSON format`.
-
-import TogetherJSONModeExample from "@examples/models/chat/integration_togetherai_json.ts";
-
-<CodeBlock language="typescript">{TogetherJSONModeExample}</CodeBlock>
-
-:::tip
-See a LangSmith trace of the above example [here](https://smith.langchain.com/public/3864aebb-5096-4b5f-b096-e54ddd1ec3d2/r).
-:::
-
-Behind the scenes, TogetherAI uses the OpenAI SDK and OpenAI compatible API, with some caveats:
-
-- Certain properties are not supported by the TogetherAI API, see [here](https://docs.together.ai/reference/chat-completions).