docs: add example notebook for the JS SDK #1067

Merged: 17 commits, Dec 4, 2024
4 changes: 4 additions & 0 deletions cookbook/_routes.json
@@ -154,5 +154,9 @@
{
"notebook": "example_query_data_via_sdk.ipynb",
"docsPath": null
},
{
"notebook": "js_langfuse_sdk.ipynb",
"docsPath": "docs/sdk/typescript/example-notebook"
}
]
361 changes: 361 additions & 0 deletions cookbook/js_langfuse_sdk.ipynb
@@ -0,0 +1,361 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "28e400f4",
"metadata": {},
"source": [
"---\n",
"description: Learn how to use the Langfuse JS/TS SDK to log any LLM.\n",
"category: Integrations\n",
"---\n",
"\n",
"# Cookbook: Langfuse JS/TS SDK\n",
"\n",
"JS/TS applications can either be traces via the [Langfuse SDK](https://langfuse.com/docs/sdk/typescript/guide) by wrapping any LLM model, or by using one of our native integrations such as [OpenAI](https://langfuse.com/docs/integrations/openai/js/get-started), [LangChain](https://langfuse.com/docs/integrations/langchain/example-javascript) or [Vercel AI SDK](https://langfuse.com/docs/integrations/vercel-ai-sdk). In this cookbook, we show you both methods to get you started.\n",
"\n",
"For this guide, we assume, that you are already familiar with the Langfuse data model (traces, spans, generations, etc.). If not, have a look [here](https://langfuse.com/docs/tracing#introduction-to-observability--traces-in-langfuse). "
]
},
{
"cell_type": "markdown",
"id": "849415ec",
"metadata": {},
"source": [
"## Step 1: Setup\n",
"\n",
"*Note: This cookbook uses Deno.js, which requires different syntax for importing packages and setting environment variables.*\n",
"\n",
"Set your Langfuse API keys, the Langfuse host name and keys for the used LLM providers."
]
},
{
"cell_type": "code",
"execution_count": 65,
"id": "ec3e7874-e6db-44cd-9d55-7ffa72f630fa",
"metadata": {},
"outputs": [],
"source": [
"// Set env variables, Deno-specific syntax\n",
"Deno.env.set(\"OPENAI_API_KEY\", \"sk-...\");\n",
"\n",
"Deno.env.set(\"ANTHROPIC_API_KEY\", \"sk-...\");\n",
"\n",
"Deno.env.set(\"LANGFUSE_SECRET_KEY\", \"sk-...\");\n",
"Deno.env.set(\"LANGFUSE_PUBLIC_KEY\", \"pk-...\");\n",
"Deno.env.set(\"LANGFUSE_HOST\", \"https://cloud.langfuse.com\") // For US data region, set this to \"https://us.cloud.langfuse.com\""
]
},
{
"cell_type": "markdown",
"id": "45b0d924",
"metadata": {},
"source": [
"Initialize the Langfuse client."
]
},
{
"cell_type": "code",
"execution_count": 66,
"id": "e2b8bb18",
"metadata": {},
"outputs": [],
"source": [
"import Langfuse from \"npm:langfuse\";\n",
"\n",
"// Init Langfuse SDK\n",
"const langfuse = new Langfuse();"
]
},
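{
"cell_type": "markdown",
"id": "7a9b3c21",
"metadata": {},
"source": [
"The keys and endpoint can also be passed to the constructor explicitly instead of relying on environment variables. A minimal sketch, reusing the variables set above; the `flushAt` option is optional and only tunes event batching:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8b0c4d32",
"metadata": {},
"outputs": [],
"source": [
"// Sketch: explicit client configuration instead of env-variable lookup\n",
"const langfuseExplicit = new Langfuse({\n",
" publicKey: Deno.env.get(\"LANGFUSE_PUBLIC_KEY\"),\n",
" secretKey: Deno.env.get(\"LANGFUSE_SECRET_KEY\"),\n",
" baseUrl: Deno.env.get(\"LANGFUSE_HOST\"),\n",
" flushAt: 1, // cookbook-only: send events immediately instead of batching\n",
"});"
]
},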
{
"cell_type": "markdown",
"id": "a9802f80",
"metadata": {},
"source": [
"## Step 2: Create a Trace\n",
"\n",
"Langfuse observability is structured around [traces](https://langfuse.com/docs/tracing#introduction-to-observability--traces-in-langfuse). Each trace can contain multiple observations to log the individual steps of the execution. Observation can be `Events`, the basic building blocks which are used to track discrete events in a trace, `Spans`, representing durations of units of work in a trace, or `Generations`, used to log model calls. \n",
"\n",
"To log an LLM call, we will first create a trace. In this step, we can also assign the trace metadata such as the a user id or tags.\n"
]
},
{
"cell_type": "code",
"execution_count": 67,
"id": "a68812d5",
"metadata": {},
"outputs": [],
"source": [
"// Creation of a unique trace id. It is optional, but this makes it easier for us to score the trace (add user feedback, etc.) afterwards. \n",
"import { v4 as uuidv4 } from \"npm:uuid\";\n",
"\n",
"const traceId = uuidv4();"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6987413c",
"metadata": {},
"outputs": [],
"source": [
"// Creation of the trace and assignment of metadata\n",
"const trace = langfuse.trace({\n",
" id: traceId,\n",
" name: \"JS-SDK-Trace\",\n",
" userId: \"user_123456789\",\n",
" metadata: { user: \"[email protected]\" },\n",
" tags: [\"production\"],\n",
"});\n",
" \n",
"// Example update, same params as create, cannot change id\n",
"trace.update({\n",
" metadata: {\n",
" tag: \"long-running\",\n",
" },\n",
"});"
]
},
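{
"cell_type": "markdown",
"id": "9c1d5e43",
"metadata": {},
"source": [
"Besides `Spans` and `Generations`, discrete `Events` can be logged on the trace. A quick sketch; the event name and payload here are illustrative:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0d2e6f54",
"metadata": {},
"outputs": [],
"source": [
"// Sketch: log a discrete event on the trace (name and payload are illustrative)\n",
"trace.event({\n",
" name: \"user-submitted-prompt\",\n",
" input: { prompt: \"Hello, Claude\" },\n",
" metadata: { source: \"cookbook\" },\n",
"});"
]
},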
{
"cell_type": "markdown",
"id": "17ad59cb",
"metadata": {},
"source": [
"## Option 1: Log Any LLM\n",
"\n",
"This part shows how to log an LLM call by passing the model in and outputs via the [Langfuse SDK](https://langfuse.com/docs/sdk/typescript/guide).\n",
"\n",
"We first create an observation of the type `Span` to which we assign the `Generation` observation. This setp is optional but lets us structure the trace.\n",
"\n",
"We then create a observation of the type `Generation` which will be assigned to the `Span` we created earlier. In the second step, we use the Anthropic SDK to call the Clause 3.5 Sonnet model. This step can be replaced with any other LLM SDK.\n",
"\n",
"Lastly, we pass the model output, the mode name and usage metrics to the `Generation`. We can now see this trace in the Langfuse UI."
]
},
{
"cell_type": "code",
"execution_count": 73,
"id": "88ae3efe",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Hello! How can I help you today?\n"
]
}
],
"source": [
"const msg = \"Hello, Claude\";\n",
"\n",
"// Create span\n",
"const span_name = \"Anthropic-Span\";\n",
"const span = trace.span({ name: span_name });\n",
"\n",
"// Example generation creation\n",
"const generation = span.generation({\n",
" name: \"anthropic-generation01\",\n",
" model: \"claude-3-5-sonnet-20241022\",\n",
" input: msg,\n",
"});\n",
" \n",
"// Application code\n",
"import Anthropic from \"npm:@anthropic-ai/sdk\";\n",
"\n",
"const anthropic = new Anthropic({ apiKey: Deno.env.get(\"ANTHROPIC_API_KEY\") });\n",
"\n",
"const chatCompletion = await anthropic.messages.create({\n",
" model: \"claude-3-5-sonnet-20241022\",\n",
" max_tokens: 1024,\n",
" messages: [{ role: \"user\", content: msg }],\n",
"});\n",
" \n",
"\n",
"// Example end - sets endTime, optionally pass a body\n",
"generation.end({\n",
" output: chatCompletion.content[0].text,\n",
" usage: {\n",
" input: chatCompletion.usage.input_tokens,\n",
" output: chatCompletion.usage.output_tokens,\n",
" },\n",
"});\n",
"\n",
"// End span to get span-level latencies\n",
"span.end();\n",
"\n",
"console.log(chatCompletion.content[0].text);\n"
]
},
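{
"cell_type": "markdown",
"id": "1e3f7a65",
"metadata": {},
"source": [
"LLM calls can fail, and it is often useful to capture failures in the trace as well. The following is a sketch, reusing the Anthropic client from above: the call is wrapped in a try/catch, and on failure the generation is ended with `level: \"ERROR\"` and a status message so the failure is visible in the Langfuse UI."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2f4a8b76",
"metadata": {},
"outputs": [],
"source": [
"// Sketch: record a failed LLM call on a generation\n",
"const errGeneration = trace.generation({\n",
" name: \"anthropic-generation-with-error-handling\",\n",
" model: \"claude-3-5-sonnet-20241022\",\n",
" input: msg,\n",
"});\n",
"\n",
"try {\n",
" const completion = await anthropic.messages.create({\n",
" model: \"claude-3-5-sonnet-20241022\",\n",
" max_tokens: 1024,\n",
" messages: [{ role: \"user\", content: msg }],\n",
" });\n",
" errGeneration.end({ output: completion.content[0].text });\n",
"} catch (err) {\n",
" // level and statusMessage surface the failure in the Langfuse UI\n",
" errGeneration.end({ level: \"ERROR\", statusMessage: String(err) });\n",
"}"
]
},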
{
"cell_type": "markdown",
"id": "a44200d4",
"metadata": {},
"source": [
"## Option 2: Using LangChain\n",
"\n",
"This step shows how to trace Langchain applications using the [Langchain integration](https://langfuse.com/docs/integrations/langchain/example-javascript). Since this is a native integration, the model parameters and outputs are automatically captured. We create a new span in our trace and assign the Langchain generation to it by passing `root: span` in the `CallbackHandler`."
]
},
{
"cell_type": "code",
"execution_count": 70,
"id": "1b0ce7b5",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Why did the bear break up with his girlfriend?\n",
"\n",
"Because she was too grizzly for him!\n"
]
}
],
"source": [
"// Create span\n",
"const span_name = \"Langchain-Span\";\n",
"const span = trace.span({ name: span_name });\n",
"\n",
"import { CallbackHandler } from \"npm:langfuse-langchain\"\n",
"const langfuseLangchainHandler = new CallbackHandler({\n",
" root: span,\n",
" publicKey: Deno.env.get(\"LANGFUSE_PUBLIC_KEY\"),\n",
" secretKey: Deno.env.get(\"LANGFUSE_SECRET_KEY\"),\n",
" baseUrl: Deno.env.get(\"LANGFUSE_HOST\"),\n",
" flushAt: 1 // cookbook-only: do not batch events, send them immediately\n",
"})\n",
"\n",
"import { ChatOpenAI } from \"npm:@langchain/openai\"\n",
"import { PromptTemplate } from \"npm:@langchain/core/prompts\"\n",
" \n",
"const model = new ChatOpenAI({});\n",
"const promptTemplate = PromptTemplate.fromTemplate(\n",
" \"Tell me a joke about {topic}\"\n",
");\n",
"\n",
"import { RunnableSequence } from \"npm:@langchain/core/runnables\";\n",
" \n",
"const chain = RunnableSequence.from([promptTemplate, model]);\n",
" \n",
"const res = await chain.invoke(\n",
" { topic: \"bears\" },\n",
" { callbacks: [langfuseLangchainHandler] }\n",
");\n",
"\n",
"// End span to get span-level latencies\n",
"span.end();\n",
" \n",
"console.log(res.content)"
]
},
{
"cell_type": "markdown",
"id": "33e61e01",
"metadata": {},
"source": [
"## Option 3: Using OpenAI\n",
"\n",
"This step shows how to trace OpenAI applications using the [OpenAI integration](https://langfuse.com/docs/integrations/openai/js/get-started). Since this is a native integration, the model parameters and outputs are automatically captured. To add the OpenAI generation to our trace as well, we first create a span and then pass `parent: span` in the `observeOpenAI` function.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "59a87971",
"metadata": {},
"outputs": [],
"source": [
"// Initialize SDKs\n",
"const openai = new OpenAI();\n",
" \n",
"// Create span\n",
"const span_name = \"OpenAI-Span\";\n",
"const span = trace.span({ name: span_name });\n",
" \n",
"// Call OpenAI\n",
"const joke = (\n",
" await observeOpenAI(openai, {\n",
" parent: span,\n",
" generationName: \"OpenAI-Generation\",\n",
" }).chat.completions.create({\n",
" model: \"gpt-3.5-turbo\",\n",
" messages: [\n",
" { role: \"system\", content: \"Tell me a joke.\" },\n",
" ],\n",
" })\n",
").choices[0].message.content;\n",
" \n",
"// End span to get span-level latencies\n",
"span.end();\n",
" \n",
"// Flush the Langfuse client belonging to the parent span\n",
"await langfuse.flushAsync();"
]
},
{
"cell_type": "markdown",
"id": "29e44ef4",
"metadata": {},
"source": [
"## Step 3: Score the Trace (Optional)\n",
"\n",
"After logging the trace, we can add [scores](https://langfuse.com/docs/scores/custom) to it. This can help in evaluating the quality of the interaction. Scores can be any metric that is important to your application. In this example, we are scoring the trace based on user feedback.\n",
"\n",
"Since the scoring usually happens after the generation is complete, we use our unique trace id to score the trace."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "25d8220d",
"metadata": {},
"outputs": [],
"source": [
"langfuse.score({\n",
" id: traceId,\n",
" name: \"user-feedback\",\n",
" value: 3,\n",
" comment: \"This was a good interaction\",\n",
"});"
]
},
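{
"cell_type": "markdown",
"id": "3a5b9c87",
"metadata": {},
"source": [
"Scores can also target a single observation by additionally passing an `observationId`. A sketch, reusing the span created earlier; in short-lived environments, flush afterwards so the queued score is actually sent:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4b6c0d98",
"metadata": {},
"outputs": [],
"source": [
"// Sketch: score a single observation instead of the whole trace\n",
"langfuse.score({\n",
" traceId: traceId,\n",
" observationId: span.id, // e.g. the span created earlier\n",
" name: \"relevance\",\n",
" value: 1,\n",
" comment: \"The answer addressed the question directly\",\n",
"});\n",
"\n",
"// Flush so queued events (including scores) are sent before the process exits\n",
"await langfuse.flushAsync();"
]
},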
{
"cell_type": "markdown",
"id": "ab1dbbd5",
"metadata": {},
"source": [
"## Step 4: View the Trace in Langfuse"
]
},
{
"cell_type": "markdown",
"id": "fc8185af",
"metadata": {},
"source": [
"![Example trace with the three generations](https://static.langfuse.com/cookbooks/js-sdk-example/js-sdk-example.gif)\n",
"\n",
"[Example trace in the Langfuse UI](https://cloud.langfuse.com/project/cloramnkj0002jz088vzn1ja4/traces/8d580443-519e-4713-9859-eff4a7193f87?timestamp=2024-12-03T17%3A45%3A16.787Z&observation=26ff69ed-8ba8-4bfe-9029-14a179828044&display=details)."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Deno",
"language": "typescript",
"name": "deno"
},
"language_info": {
"codemirror_mode": "typescript",
"file_extension": ".ts",
"mimetype": "text/x.typescript",
"name": "typescript",
"nbconvert_exporter": "script",
"pygments_lexer": "typescript",
"version": "5.6.2"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
1 change: 1 addition & 0 deletions pages/docs/sdk/typescript/_meta.tsx
@@ -1,6 +1,7 @@
export default {
guide: "Guide",
"guide-web": "Guide (Web)",
"example-notebook": "Example Notebook",
reference: {
title: "Reference ↗",
href: "https://js.reference.langfuse.com",