docs: custom callback handlers page (#20494)
**Description:** Update to the Callbacks page on custom callback
handlers
**Issue:** #20493 
**Dependencies:** None

---------

Co-authored-by: Bagatur <[email protected]>
3 people authored Apr 25, 2024
1 parent 5da9dd1 commit 9e69496
Showing 1 changed file with 32 additions and 39 deletions.
71 changes: 32 additions & 39 deletions docs/docs/modules/callbacks/custom_callbacks.ipynb
@@ -7,12 +7,22 @@
"source": [
"# Custom callback handlers\n",
"\n",
"You can create a custom handler to set on the object as well. In the example below, we'll implement streaming with a custom handler."
"To create a custom callback handler we need to determine the [event(s)](/docs/modules/callbacks/) we want our callback handler to handle as well as what we want our callback handler to do when the event is triggered. Then all we need to do is attach the callback handler to the object either as a constructer callback or a request callback (see [callback types](/docs/modules/callbacks/))."
]
},
{
"cell_type": "markdown",
"id": "428d5e5f",
"metadata": {},
"source": [
"In the example below, we'll implement streaming with a custom handler.\n",
"\n",
"In our custom callback handler `MyCustomHandler`, we implement the `on_llm_new_token` to print the token we have just received. We then attach our custom handler to the model object as a constructor callback."
]
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 5,
"id": "ed9e8756",
"metadata": {},
"outputs": [
@@ -22,38 +32,25 @@
"text": [
"My custom handler, token: \n",
"My custom handler, token: Why\n",
"My custom handler, token: don\n",
"My custom handler, token: 't\n",
"My custom handler, token: scientists\n",
"My custom handler, token: trust\n",
"My custom handler, token: atoms\n",
"My custom handler, token: do\n",
"My custom handler, token: bears\n",
"My custom handler, token: have\n",
"My custom handler, token: hairy\n",
"My custom handler, token: coats\n",
"My custom handler, token: ?\n",
"My custom handler, token: \n",
"\n",
"\n",
"My custom handler, token: Because\n",
"My custom handler, token: they\n",
"My custom handler, token: make\n",
"My custom handler, token: up\n",
"My custom handler, token: everything\n",
"My custom handler, token: .\n",
"My custom handler, token: F\n",
"My custom handler, token: ur\n",
"My custom handler, token: protection\n",
"My custom handler, token: !\n",
"My custom handler, token: \n"
]
},
{
"data": {
"text/plain": [
"AIMessage(content=\"Why don't scientists trust atoms? \\n\\nBecause they make up everything.\", additional_kwargs={}, example=False)"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain_core.callbacks import BaseCallbackHandler\n",
"from langchain_core.messages import HumanMessage\n",
"from langchain_core.prompts import ChatPromptTemplate\n",
"from langchain_openai import ChatOpenAI\n",
"\n",
"\n",
@@ -62,27 +59,23 @@
" print(f\"My custom handler, token: {token}\")\n",
"\n",
"\n",
"prompt = ChatPromptTemplate.from_messages([\"Tell me a joke about {animal}\"])\n",
"\n",
"# To enable streaming, we pass in `streaming=True` to the ChatModel constructor\n",
"# Additionally, we pass in a list with our custom handler\n",
"chat = ChatOpenAI(max_tokens=25, streaming=True, callbacks=[MyCustomHandler()])\n",
"# Additionally, we pass in our custom handler as a list to the callbacks parameter\n",
"model = ChatOpenAI(streaming=True, callbacks=[MyCustomHandler()])\n",
"\n",
"chat.invoke([HumanMessage(content=\"Tell me a joke\")])"
"chain = prompt | model\n",
"\n",
"response = chain.invoke({\"animal\": \"bears\"})"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "67ef5548",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "venv",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "venv"
"name": "python3"
},
"language_info": {
"codemirror_mode": {
@@ -94,7 +87,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
"version": "3.9.1"
}
},
"nbformat": 4,
