The Streamlit docs on creating a streaming chatbot show how to stream the response from client.chat.completions.create(), but there is no example (that I can find) of creating a streaming chat engine from an index object as shown in the LlamaIndex examples:
chat_engine = index.as_chat_engine()
streaming_response = chat_engine.stream_chat("Tell me a joke.")
for token in streaming_response.response_gen:
    print(token, end="")
If I try to use chat_engine.stream_chat with the for response in client.chat.completions.create() pattern shown in the Streamlit docs, I get RuntimeError: There is no current event loop in thread 'ScriptRunner.scriptThread'.
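For reference, app.py looks roughly like this (only the loop on line 67 is taken verbatim from the traceback below; the index construction and chat UI around it are a simplified approximation):

import streamlit as st
from llama_index import VectorStoreIndex, SimpleDirectoryReader

# Build the index and chat engine once per session (approximate setup).
if "chat_engine" not in st.session_state:
    documents = SimpleDirectoryReader("data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    st.session_state.chat_engine = index.as_chat_engine()

if prompt := st.chat_input("Ask a question"):
    with st.chat_message("assistant"):
        placeholder = st.empty()
        full_response = ""
        # app.py line 67 from the traceback -- this call raises the RuntimeError
        for token in st.session_state.chat_engine.stream_chat(prompt):
            full_response += token
            placeholder.markdown(full_response)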
Full traceback:
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 534, in _run_script
    exec(code, module.__dict__)
  File "/workspaces/gcp_llm/app.py", line 67, in <module>
    for token in st.session_state.chat_engine.stream_chat(prompt):
  File "/usr/local/lib/python3.9/site-packages/llama_index/callbacks/utils.py", line 39, in wrapper
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/llama_index/agent/openai_agent.py", line 444, in stream_chat
    chat_response = self._chat(
  File "/usr/local/lib/python3.9/site-packages/llama_index/agent/openai_agent.py", line 330, in _chat
    agent_chat_response = self._get_agent_response(mode=mode, **llm_chat_kwargs)
  File "/usr/local/lib/python3.9/site-packages/llama_index/agent/openai_agent.py", line 295, in _get_agent_response
    return self._get_stream_ai_response(**llm_chat_kwargs)
  File "/usr/local/lib/python3.9/site-packages/llama_index/agent/openai_agent.py", line 196, in _get_stream_ai_response
    chat_stream_response = StreamingAgentChatResponse(
  File "<string>", line 10, in __init__
  File "/usr/local/lib/python3.9/asyncio/queues.py", line 36, in __init__
    self._loop = events.get_event_loop()
  File "/usr/local/lib/python3.9/asyncio/events.py", line 642, in get_event_loop
    raise RuntimeError('There is no current event loop in thread %r.'
RuntimeError: There is no current event loop in thread 'ScriptRunner.scriptThread'.
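Is the expected workaround to create and set an event loop on Streamlit's script thread before calling stream_chat? A minimal, untested sketch of that idea, assuming the failure is just asyncio.Queue calling get_event_loop() from a thread that has no loop set (Python 3.9):

import asyncio

# Streamlit runs the script in 'ScriptRunner.scriptThread', not the main thread,
# so get_event_loop() raises unless a loop has been created and set for it.
try:
    asyncio.get_event_loop()
except RuntimeError:
    asyncio.set_event_loop(asyncio.new_event_loop())

Or is there a recommended way to drive chat_engine.stream_chat from a Streamlit app?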