# langfuse-haystack

[![PyPI - Version](https://img.shields.io/pypi/v/langfuse-haystack.svg)](https://pypi.org/project/langfuse-haystack)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/langfuse-haystack.svg)](https://pypi.org/project/langfuse-haystack)

-----
langfuse-haystack integrates tracing capabilities into [Haystack](https://github.com/deepset-ai/haystack) (2.x) pipelines using [Langfuse](https://langfuse.com/). This package enhances the visibility of pipeline runs by capturing comprehensive details of the execution traces, including API calls, context data, prompts, and more. Whether you're monitoring model performance, pinpointing areas for improvement, or creating datasets for fine-tuning and testing from your pipeline executions, langfuse-haystack is the right tool for you.

## Features

- Easy integration with Haystack pipelines
- Capture the full context of the execution
- Track model usage and cost
- Collect user feedback
- Identify low-quality outputs
- Build fine-tuning and testing datasets

## Installation

To install langfuse-haystack, simply run the following command:

```sh
pip install langfuse-haystack
```
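
Langfuse itself is configured through environment variables. A minimal setup sketch, assuming the standard Langfuse variable names (the keys come from your Langfuse project settings):

```python
import os

# Langfuse credentials and host; the variable names follow the Langfuse documentation.
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."  # replace with your secret key
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."  # replace with your public key
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # or your self-hosted Langfuse URL
```

You can equally export these variables in your shell before running the pipeline.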

## Usage

To enable tracing, add the `LangfuseComponent` to your Haystack pipeline. Here's an example:

```python
import os

os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
os.environ["TOKENIZERS_PARALLELISM"] = "false"
# Enable content tracing so prompts and completions are captured in the trace
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

from haystack.components.builders import DynamicChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline

from haystack_integrations.components.others.langfuse import LangfuseComponent

if __name__ == "__main__":
    # Build a chat pipeline and attach the Langfuse tracer to it
    pipe = Pipeline()
    pipe.add_component("tracer", LangfuseComponent("Chat example"))
    pipe.add_component("prompt_builder", DynamicChatPromptBuilder())
    pipe.add_component("llm", OpenAIChatGenerator(model="gpt-3.5-turbo"))

    pipe.connect("prompt_builder.prompt", "llm.messages")

    messages = [
        ChatMessage.from_system("Always respond in German even if some input data is in other languages."),
        ChatMessage.from_user("Tell me about {{location}}"),
    ]

    response = pipe.run(
        data={"prompt_builder": {"template_variables": {"location": "Berlin"}, "prompt_source": messages}}
    )
    print(response["llm"]["replies"][0])
    print(response["tracer"]["trace_url"])
```

In this example, we add the `LangfuseComponent` to the pipeline under the name "tracer". Each pipeline run produces one trace, viewable in Langfuse at the URL returned by the tracer. The trace captures the entire execution context, including the prompts, completions, and metadata.
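
Because each run yields a trace with its own URL, you can also attach user feedback to it after the fact. As a rough sketch (this uses the Langfuse Python SDK directly, not this package, and assumes its `score` API plus the trace id being the last path segment of the trace URL -- check the Langfuse docs for the exact details):

```python
from langfuse import Langfuse

# Continuing from the example above: `response` holds the pipeline output.
trace_url = response["tracer"]["trace_url"]
trace_id = trace_url.rstrip("/").split("/")[-1]  # assumption: the trace id is the URL's last segment

client = Langfuse()  # reads the LANGFUSE_* credentials from the environment
client.score(trace_id=trace_id, name="user-feedback", value=1, comment="Helpful answer about Berlin")
client.flush()  # make sure the score is sent before the process exits
```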

## Trace Visualization

Langfuse provides a user-friendly interface to visualize and analyze the traces generated by your Haystack pipeline. Simply log in to your Langfuse account and navigate to the trace URL to view the trace details.

## Contributing

`hatch` is the best way to interact with this project. To install it, run:
```sh
pip install hatch
```

With `hatch` installed, run all the tests with:
```sh
hatch run test
```

To run the linters `ruff` and `mypy`:
```sh
hatch run lint:all
```

## License

`langfuse-haystack` is distributed under the terms of the [Apache-2.0](https://spdx.org/licenses/Apache-2.0.html) license.
