From 8feb4c2a000f322e545ed5d476770e4c99638d57 Mon Sep 17 00:00:00 2001
From: Marc Klingen
Date: Tue, 17 Dec 2024 13:13:03 +0100
Subject: [PATCH] add diagrams

---
 ...2-langfuse-v3-infrastructure-evolution.mdx | 24 ++++++++++++++++++-
 1 file changed, 23 insertions(+), 1 deletion(-)

diff --git a/pages/blog/2024-12-langfuse-v3-infrastructure-evolution.mdx b/pages/blog/2024-12-langfuse-v3-infrastructure-evolution.mdx
index e78f2fafe..a27fce306 100644
--- a/pages/blog/2024-12-langfuse-v3-infrastructure-evolution.mdx
+++ b/pages/blog/2024-12-langfuse-v3-infrastructure-evolution.mdx
@@ -17,10 +17,32 @@ import Link from "next/link";
   authors={["steffenschmitz", "maxdeichmann"]}
 />
 
-Langfuse, the open-source LLM observability platform, emerged from the Y Combinator Winter 2023 batch. We worked closely with a few of our batchmates to quickly develop a v0 LLM observability platform that satisfied a few core criteria: SDKs were asynchronous, based on tracing, as well as open source and easily self-hostable. The first version was written on NextJs, Vercel, and Postgres. Little did we know we would rapidly evolve from an experiment to processing tens of thousands of events per minute.
+Langfuse, the open-source LLM observability platform, emerged from the Y Combinator Winter 2023 batch. After building many LLM applications ourselves and realizing how hard it is to go from demo to production, we worked closely with a few of our batchmates to quickly develop a v0 LLM observability platform. We focused on getting a few core features right: asynchronous SDKs, tracing as the underlying abstraction, and an open-source, easily self-hostable codebase. The first version was built on Next.js, Vercel, and Postgres. Little did we know we would rapidly evolve from an experiment to processing tens of thousands of events per minute.
 
 Our recent V3 release marks a significant milestone in ensuring Langfuse can scale for all of our users. Additionally, it unlocks capabilities for self-hosting users. Features like online evaluations, async ingestion, and cached prompts are now also available in self-hosting. In this post, we will walk you through the scaling challenges we faced while building Langfuse and how our “hypothesis - experiment - feedback” loop helped us arrive at Langfuse v3. If you are interested in solving similar challenges with us - we are hiring in Berlin!
 
+import ArchitectureDiagramV2 from "@/components-mdx/architecture-diagram-v2.mdx";
+import ArchitectureDiagramV3 from "@/components-mdx/architecture-diagram-v3.mdx";
+import ArchitectureDescriptionV3 from "@/components-mdx/architecture-description-v3.mdx";
+
+
+
+
+Architecture Diagram
+
+
+
+
+
+
+
+Architecture Diagram
+
+
+
+
+
+
 ## Challenges
 
 ### Challenge 1: Building a Resilient High-Throughput Ingestion Pipeline