Commit 8feb4c2: add diagrams

marcklingen committed Dec 17, 2024 (parent: 77c2fe4)
1 changed file with 23 additions and 1 deletion: pages/blog/2024-12-langfuse-v3-infrastructure-evolution.mdx
import Link from "next/link";
authors={["steffenschmitz", "maxdeichmann"]}
/>

Langfuse, the open-source LLM observability platform, emerged from the Y Combinator Winter 2023 batch. After building many LLM applications ourselves and realizing how hard it is to go from demo to production, we worked closely with a few of our batchmates to quickly develop a v0 LLM observability platform. We focused on getting a few core features right: asynchronous, tracing-based SDKs that were open source and easy to self-host. The first version was built on Next.js, Vercel, and Postgres. Little did we know we would rapidly evolve from an experiment to processing tens of thousands of events per minute.
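"Asynchronous" here means the SDK never blocks the application's request path: events go into an in-memory queue and a background worker batches them toward the ingestion API. The following is a minimal Python sketch of that pattern, illustrative only; it is not the actual Langfuse SDK, and all names in it are hypothetical.

```python
import queue
import threading
import time


class AsyncTracer:
    """Sketch of a non-blocking tracing client: callers enqueue events
    instantly; a background thread batches them for the ingestion API."""

    def __init__(self, flush_interval: float = 0.05, batch_size: int = 100):
        self._queue: queue.Queue = queue.Queue()
        self.batches: list = []  # stands in for HTTP POSTs to the backend
        self._interval = flush_interval
        self._batch_size = batch_size
        self._enqueued = 0
        self._sent = 0
        threading.Thread(target=self._run, daemon=True).start()

    def trace(self, name: str, **attrs) -> None:
        # Non-blocking: the application never waits on network I/O.
        self._enqueued += 1
        self._queue.put({"name": name, **attrs})

    def _run(self) -> None:
        while True:
            batch = []
            while len(batch) < self._batch_size:
                try:
                    batch.append(self._queue.get_nowait())
                except queue.Empty:
                    break
            if batch:
                self.batches.append(batch)  # a real SDK would POST here
                self._sent += len(batch)
            time.sleep(self._interval)

    def flush(self, timeout: float = 2.0) -> None:
        # Block until all enqueued events were handed off (e.g. at exit).
        deadline = time.monotonic() + timeout
        while self._sent < self._enqueued and time.monotonic() < deadline:
            time.sleep(0.01)
```

Because `trace` only appends to a queue, instrumenting an LLM call adds near-zero latency to the request itself; the trade-off is that a `flush` is needed before process exit so buffered events are not lost.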
Our recent v3 release marks a significant milestone in ensuring Langfuse can scale for all of our users. It also unlocks new capabilities for self-hosting users: features like online evaluations, async ingestion, and cached prompts are now available in self-hosted deployments as well.
In this post, we will walk you through the scaling challenges we faced while building Langfuse and how our "hypothesis - experiment - feedback" loop helped us arrive at Langfuse v3. If you are interested in solving similar challenges with us, we are hiring in Berlin!

import ArchitectureDiagramV2 from "@/components-mdx/architecture-diagram-v2.mdx";
import ArchitectureDiagramV3 from "@/components-mdx/architecture-diagram-v3.mdx";
import ArchitectureDescriptionV3 from "@/components-mdx/architecture-description-v3.mdx";

<Tabs items={["Langfuse v3", "Langfuse v2"]}>
<Tab>

Architecture Diagram

<ArchitectureDiagramV3 />

</Tab>

<Tab>

Architecture Diagram

<ArchitectureDiagramV2 />

</Tab>
</Tabs>

## Challenges

### Challenge 1: Building a Resilient High-Throughput Ingestion Pipeline
