Streaming or non-streaming results in the same output issue. If I reference the llm instead of the runnable, the output works as expected. I've attempted to research what could be happening internally in Shiny, but have come up blank.
I've also attempted to use the string output parser (StrOutputParser) in the chain, but it results in the same issue.
If I exclude the LangChain prompt and use the chat messages as the prompt directly, parsing seems to be OK.
from langchain_core.output_parsers import StrOutputParser
from shiny.express import ui

system_message = {
    "content": "You love vanilla ice cream and will always recommend it.",
    "role": "system",
}

parser = StrOutputParser()
runnable = llm | parser  # llm is a LangChain chat model initialized elsewhere

# Create and display an empty chat, seeded with the system message
chat = ui.Chat(id="chat", messages=[system_message])
# chat = ui.Chat(id="chat")
chat.ui()

# Define a callback to run when the user submits a message
@chat.on_user_submit
async def _():
    # Get messages currently in the chat
    messages = chat.messages(format="langchain")
    # Streaming: astream() returns an async generator of chunks
    response = runnable.astream(messages)
    await chat.append_message_stream(response)
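One thing that might be worth trying as a workaround: wrap the chain's stream so that only each chunk's text content is yielded before handing it to `chat.append_message_stream(...)`, so the repr of a message object can never reach the chat UI. The `as_text` helper below is mine, not part of shiny or langchain, and `FakeChunk`/`fake_astream` are stand-ins for the real chunk stream, so this is only a sketch of the pattern:

```python
import asyncio
from dataclasses import dataclass


@dataclass
class FakeChunk:
    # Stand-in for a LangChain message chunk carrying a .content attribute
    content: str


async def fake_astream():
    # Simulates runnable.astream(messages) yielding message chunks
    for part in ["Vanilla ", "ice ", "cream!"]:
        yield FakeChunk(content=part)


async def as_text(stream):
    # Yield only the text of each chunk; fall back to str() for plain values,
    # so whatever the chain emits, the chat only ever sees strings
    async for chunk in stream:
        yield chunk.content if hasattr(chunk, "content") else str(chunk)


async def collect():
    pieces = []
    async for piece in as_text(fake_astream()):
        pieces.append(piece)
    return "".join(pieces)


result = asyncio.run(collect())
print(result)  # Vanilla ice cream!
```

In the app, the call would become `await chat.append_message_stream(as_text(runnable.astream(messages)))`.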
So it might be something in the chat-messages handling of ui.Chat? If I initialize an LLM with LangChain and use it directly in ui.Chat, the output works as expected. But if I attempt to use LCEL to chain prompts/LLMs together, the output appears unparsed, as a string representation of the AIMessage and the system message. The code above reproduces this.
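For anyone unfamiliar with the "chain prompts/LLMs together" part: LCEL composes runnables with the `|` operator, each step's output feeding the next step's input. A minimal pure-Python stand-in (these are not the real langchain classes, just an illustration of the piping) looks like:

```python
# Stand-ins for LCEL-style piping (illustration only, not langchain).
# Each runnable exposes invoke(), and `|` composes them left to right.
class Runnable:
    def __or__(self, other):
        return Pipe(self, other)


class Pipe(Runnable):
    def __init__(self, left, right):
        self.left, self.right = left, right

    def invoke(self, value):
        # Run the left step, then feed its output into the right step
        return self.right.invoke(self.left.invoke(value))


class FakeLLM(Runnable):
    def invoke(self, messages):
        # A chat model returns a message object, not a bare string
        return {"role": "ai", "content": "I recommend vanilla."}


class FakeStrOutputParser(Runnable):
    def invoke(self, message):
        # Extract just the text, like StrOutputParser does
        return message["content"]


runnable = FakeLLM() | FakeStrOutputParser()
answer = runnable.invoke([{"role": "user", "content": "Flavor?"}])
print(answer)  # I recommend vanilla.
```

This is why the parser at the end of the chain should already leave plain strings in the stream, which makes the unparsed AIMessage repr in the chat output all the more puzzling.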