Commit 4906891 — docs: add changelog for colorcoded trace and cost text (#855)
Co-authored-by: Marc Klingen <[email protected]>
1 changed file, 21 additions: `pages/changelog/2024-10-10-text-color-for-latency-and-costs.mdx`
---
date: 2024-10-10
title: Aggregated and Color-coded Latency and Costs on Traces
description: Large traces can be hard to read. We've added aggregated latency and cost information at every span level to make it easier to spot outliers and debug the LLM application.
author: Marc
ogCloudflareVideo: 42946e11682678869acd283f73bd1048
---

import { ChangelogHeader } from "@/components/changelog/ChangelogHeader";

<ChangelogHeader />

1. All spans in a trace now show **aggregated latency and cost information** from nested observations. Learn more about the tracing data model [here](/docs/tracing).
2. **Optionally, you can enable color-coding** to make it easier to detect outliers and debug the LLM application. You can toggle this via the `%` button in the top right corner of the trace view. The color is determined by the ratio of a span's latency/cost to the total trace latency/cost:

   - Red: when the ratio is 75% or higher.
   - Yellow: when the ratio is between 50% and 75%.
   - No color: when the ratio is below 50%.
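
The threshold rule above can be sketched as a small helper. This is a hypothetical illustration of the described behavior, not Langfuse's actual implementation; the function name and signature are assumptions.

```typescript
// Hypothetical sketch of the color-coding rule described above;
// not the actual Langfuse implementation.
type HighlightColor = "red" | "yellow" | "none";

function highlightColor(spanValue: number, traceTotal: number): HighlightColor {
  if (traceTotal <= 0) return "none"; // guard against division by zero
  const ratio = spanValue / traceTotal;
  if (ratio >= 0.75) return "red";    // 75% or higher
  if (ratio >= 0.5) return "yellow";  // between 50% and 75%
  return "none";                      // below 50%
}

// Example: a span accounting for 80% of total trace latency is flagged red.
console.log(highlightColor(800, 1000)); // prints "red"
```

The same ratio-based rule applies independently to latency and to cost, so a span can be highlighted on one dimension but not the other.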

This change was based on feedback from a YC company building complex LLM agents. If you think the Langfuse interface could be improved to help you understand and debug your LLM applications, please share your feedback with us via [GitHub](/ideas)!