Langfuse integration #646
@deepset-ai/devrel FYI, this will be scheduled for the upcoming sprint.
@vblagoje reopening as the task is not done; please make sure the tasklist is up to date!
Ok, will do @masci - it got closed because of the "fixes" link on the PR. All subtasks are finished except docs publishing (although the docs themselves are completed) and the social media announcement.
@dfokina @annthurium when the respective subtasks are completed, would you please mark them on this issue?
The docs are live: https://docs.haystack.deepset.ai/docs/langfuseconnector |
I followed the documentation but the result is not what I expected (screenshot not preserved). Here is my code:

```python
from haystack.dataclasses import ChatMessage
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.builders import DynamicChatPromptBuilder
from haystack import Pipeline
from haystack_integrations.components.connectors.langfuse import LangfuseConnector

import os

os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-...."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-...."
os.environ["OPENAI_API_KEY"] = "sk-...."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
os.environ["TOKENIZERS_PARALLELISM"] = "false"
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

pipe = Pipeline()
pipe.add_component("tracer", LangfuseConnector("Chat example"))
pipe.add_component("prompt_builder", DynamicChatPromptBuilder())
pipe.add_component("llm", OpenAIChatGenerator(model="gpt-3.5-turbo"))
pipe.connect("prompt_builder.prompt", "llm.messages")

messages = [
    ChatMessage.from_system("Always respond in German even if some input data is in other languages."),
    ChatMessage.from_user("Tell me about {{location}}"),
]

response = pipe.run(
    data={"prompt_builder": {"template_variables": {"location": "Berlin"}, "prompt_source": messages}}
)
print(response["llm"]["replies"][0])
print(response["tracer"]["trace_url"])

# ChatMessage(content='Berlin ist die Hauptstadt von Deutschland und zugleich eines der bekanntesten kulturellen Zentren Europas. Die Stadt hat eine faszinierende Geschichte, die bis in die Zeiten des Zweiten Weltkriegs und des Kalten Krieges zurückreicht. Heute ist Berlin für seine vielfältige Kunst- und Musikszene, seine historischen Stätten wie das Brandenburger Tor und die Berliner Mauer sowie seine lebendige Street-Food-Kultur bekannt. Berlin ist auch für seine grünen Parks und Seen beliebt, die den Bewohnern und Besuchern Raum für Erholung bieten.', role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-3.5-turbo-0125', 'index': 0, 'finish_reason': 'stop', 'usage': {'completion_tokens': 137, 'prompt_tokens': 29, 'total_tokens': 166}})
# https://cloud.langfuse.com/trace/YOUR_UNIQUE_IDENTIFYING_STRING
```
Hey @caubechankiu, due to Python's module loading mechanism you need to `import os` first, set the env values, and only then import everything else. Then it'll work. Also, select the nodes below "Chat example" in the right panel, as the root doesn't have input; only the `haystack.pipeline.run` node and the nodes below it do. Let us know if adjusting these worked. Here is my pip install on a fresh virtual env (screenshot not preserved).
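The ordering matters because libraries often read configuration from environment variables at import time, when their module-level code runs. A minimal, self-contained sketch of the pitfall (the `tracing_lib` module and `MY_TRACING_ENABLED` variable are hypothetical stand-ins, not Haystack's actual names):

```python
import os
import types

# Make sure the flag is unset, mimicking a fresh process where the env
# var has not been exported yet.
os.environ.pop("MY_TRACING_ENABLED", None)

# Stand-in for "import tracing_lib": module-level code runs exactly once,
# so the flag's value is frozen at this moment.
tracing_lib = types.ModuleType("tracing_lib")
tracing_lib.TRACING = os.environ.get("MY_TRACING_ENABLED") == "true"

# Setting the variable *after* the simulated import has no effect on the
# value the module already captured.
os.environ["MY_TRACING_ENABLED"] = "true"
print(tracing_lib.TRACING)  # False: set the env var before importing instead
```

This is why the snippet in the comment above only works once `os.environ[...]` assignments are moved ahead of the Haystack imports.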
Thank you! It worked.
@annthurium would you please close this issue after the social media announcement?
yep! |
Ok, I think we can finally close this one @annthurium @masci |
Summary and motivation
Langfuse would be extremely helpful for monitoring and debugging Haystack pipelines, by visualising the inputs and outputs of each component execution during a `pipeline.run()` call.
Detailed design
Checklist
If the request is accepted, ensure the following checklist is complete before closing this issue.
Tasks