
Langfuse Connector Failing to Trace Pipeline Data #789

Closed
LTOpen opened this issue Jun 6, 2024 · 2 comments

LTOpen commented Jun 6, 2024

Describe the bug
I tried using Langfuse to trace my pipeline, but it appears that nothing is being traced. Initially I suspected the model was the problem, but when I ran the example code from the documentation I observed only partial functionality: the pipeline fails to trace input and output values.

To diagnose the issue, I modified my pipeline several times. Even when I explicitly declare the input and output values, Langfuse traces only the final output value and still fails to capture intermediate values, particularly the LLM's input and output.

After numerous tests, I'm still unable to determine the root cause. Screenshots are attached for reference.

My Code

from haystack import Pipeline
from haystack.dataclasses import ChatMessage
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.builders.chat_prompt_builder import ChatPromptBuilder
from haystack.components.converters import OutputAdapter
from haystack_integrations.components.connectors.langfuse import LangfuseConnector
import os

os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

tracer = LangfuseConnector('haystack-test')

messages = [
    ChatMessage.from_system("""You are a data generator, and you will generate data according to the given format.
---
User: 
Language: English 
Number of words: 1 
Part of speech: Noun 

Assistant: 
Result: Apple
---"""
),
    ChatMessage.from_user("""User:
Language: {{lang}}
Number of words: {{num}}
Part of speech: {{pos}}""")
]

prompt_builder = ChatPromptBuilder(template=messages)

llm = OpenAIChatGenerator(
    model='gpt-3.5-turbo'
)
adapter = OutputAdapter(
    template="""{{replies[0].content}}""",output_type=str
)

gen_pipe = Pipeline()

gen_pipe.add_component("tracer", tracer)
gen_pipe.add_component("prompt", prompt_builder)
gen_pipe.add_component("llm", llm)
gen_pipe.add_component("adapter", adapter)

gen_pipe.connect("prompt.prompt", "llm.messages")
gen_pipe.connect("llm.replies", "adapter")

response = gen_pipe.run(data={
    'prompt':{
        'template_variables': {
            'lang':'English',
            'num':1,
            'pos':'verb'
        }
    }
})
print(response)
print(response["tracer"]["trace_url"])

Screenshots

Example code in the documentation works: [screenshot]

Tracing result in my code: [screenshot]

System:

  • OS: Win10
  • GPU/CPU: Intel/Nvidia
  • Haystack version (commit or version number): 2.2.0
  • Python version: 3.10.13
@anakin87 anakin87 transferred this issue from deepset-ai/haystack Jun 6, 2024
Redna commented Jun 7, 2024

One thing you can try is to set the environment variable before you actually import LangfuseConnector. This is from the docstring of the Langfuse tracer:

    import os

    os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

    from haystack import Pipeline
    from haystack.components.builders import DynamicChatPromptBuilder
    from haystack.components.generators.chat import OpenAIChatGenerator
    from haystack.dataclasses import ChatMessage
    from haystack_integrations.components.connectors.langfuse import LangfuseConnector

#...

I set the variable in my run configuration, and content tracing works fine most of the time.
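The import-order point above can be sketched with a small, self-contained example. This is a hypothetical stand-in (the `ContentTracer` class below is not the real Haystack/Langfuse code), assuming the `HAYSTACK_CONTENT_TRACING_ENABLED` flag is read once at import/construction time and never re-read — which would explain why setting it after the import has no effect:

```python
import os

class ContentTracer:
    """Hypothetical stand-in for a tracer that reads the flag only once."""
    def __init__(self):
        # The flag is captured here and never re-read afterwards.
        self.enabled = os.environ.get("HAYSTACK_CONTENT_TRACING_ENABLED") == "true"

# Start from a clean environment for the demonstration.
os.environ.pop("HAYSTACK_CONTENT_TRACING_ENABLED", None)

# "Import first, set the variable afterwards": the flag is missed.
too_late = ContentTracer()
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"
print(too_late.enabled)  # False

# "Set the variable first, then import": the flag is seen.
early = ContentTracer()
print(early.enabled)  # True
```

The same reasoning applies to the reporter's snippet: moving the two `os.environ[...]` assignments above the `haystack` and `haystack_integrations` imports should let the connector pick up the content-tracing flag.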

anakin87 (Member) commented

@Redna's suggestion seems to point in the right direction.

@LTOpen I am closing the issue. Feel free to reopen it if the problem persists.
