Describe the bug
When using the code_interpreter, an error occurs; see the console output under Screenshots for details. The error does not occur when the code_interpreter is not used. I cannot find the reason.
To Reproduce
1. Run `python -m taskweaver -p ./project`
2. Enter the request: `count the row of ./projects/livis_demo/livis/project/sample_data/demo_data.csv`
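For reference, the row count that the CodeInterpreter is being asked to produce amounts to a few lines of standard-library Python. This is a hedged sketch: `count_csv_rows` and the sample data are illustrative placeholders, since the actual contents of `demo_data.csv` are not shown in the report.

```python
import csv
import io

def count_csv_rows(text: str, has_header: bool = True) -> int:
    """Count data rows in CSV text, optionally excluding the header row."""
    reader = csv.reader(io.StringIO(text))
    rows = sum(1 for _ in reader)
    return rows - 1 if (has_header and rows > 0) else rows

# Placeholder data standing in for demo_data.csv (actual contents unknown).
sample = "id,name\n1,alice\n2,bob\n3,carol\n"
print(count_csv_rows(sample))  # 3 data rows after excluding the header
```

The failure in this issue happens before any such code runs: the error is raised while streaming the LLM's response, not while executing the generated code.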
Expected behavior
The row count of the CSV file is reported without any error.
Screenshots
=========================================================
[TaskWeaver ASCII-art banner]
=========================================================
TaskWeaver is running in the `local` mode. This implies that the code execution service will run on the same machine as the TaskWeaver server. For better security, it is recommended to run the code execution service in the `container` mode. More information can be found in the documentation (https://microsoft.github.io/TaskWeaver/docs/code_execution/).
--- new session starts ---
TaskWeaver ▶ I am TaskWeaver, an AI assistant. To get started, could you please enter your request?
Human ▶ count the row of ./projects/livis_demo/livis/project/sample_data/demo_data.csv
╭───< Planner >
├─► [init_plan]
│ 1. load the data file
│ 2. count the rows of the loaded data <sequentially depends on 1>
│ 3. report the result to the user <interactively depends on 2>
├─► [plan]
│ 1. instruct CodeInterpreter to load the data file and count the rows of the loaded data
│ 2. report the result to the user
├─► [current_plan_step] 1. instruct CodeInterpreter to load the data file and count the rows of the loaded data
├──● Please load the data file ./projects/livis_demo/livis/project/sample_data/demo_data.csv and count the rows of the loaded data
├─► [board]
│ I have drawn up a plan:
│ 1. instruct CodeInterpreter to load the data file and count the rows of the loaded data
│ 2. report the result to the user
│
│ Please proceed with this step of this plan: Please load the data file ./projects/livis_demo/livis/project/sample_data/demo_data.csv and count the rows of the loaded data
╰──● sending message to CodeInterpreter
╭───< CodeInterpreter >
╰──● sending message to
Error: Cannot process your request due to Exception: peer closed connection without sending complete message body (incomplete chunked read)
Traceback (most recent call last):
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
yield
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpx/_transports/default.py", line 113, in __iter__
for part in self._httpcore_stream:
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 367, in __iter__
raise exc from None
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 363, in __iter__
for part in self._stream:
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpcore/_sync/http11.py", line 349, in __iter__
raise exc
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpcore/_sync/http11.py", line 341, in __iter__
for chunk in self._connection._receive_response_body(**kwargs):
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpcore/_sync/http11.py", line 210, in _receive_response_body
event = self._receive_event(timeout=timeout)
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpcore/_sync/http11.py", line 220, in _receive_event
with map_exceptions({h11.RemoteProtocolError: RemoteProtocolError}):
File "/home/miniconda3/envs/taskweaver/lib/python3.10/contextlib.py", line 153, in __exit__
self.gen.throw(typ, value, traceback)
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/work/projects/livis_demo/livis/taskweaver/session/session.py", line 202, in _send_text_message
post = _send_message(post.send_to, post)
File "/home/work/projects/livis_demo/livis/taskweaver/module/tracing.py", line 174, in wrapper
return func(*args, **kwargs)
File "/home/work/projects/livis_demo/livis/taskweaver/session/session.py", line 182, in _send_message
reply_post = self.worker_instances[recipient].reply(
File "/home/work/projects/livis_demo/livis/taskweaver/module/tracing.py", line 186, in wrapper
return func(self, *args, **kwargs)
File "/home/work/projects/livis_demo/livis/taskweaver/code_interpreter/code_interpreter/code_interpreter.py", line 137, in reply
self.generator.reply(
File "/home/work/projects/livis_demo/livis/taskweaver/module/tracing.py", line 186, in wrapper
return func(self, *args, **kwargs)
File "/home/work/projects/livis_demo/livis/taskweaver/code_interpreter/code_interpreter/code_generator.py", line 378, in reply
self.post_translator.raw_text_to_post(
File "/home/work/projects/livis_demo/livis/taskweaver/role/translator.py", line 85, in raw_text_to_post
for type_str, value, is_end in parser_stream:
File "/home/work/projects/livis_demo/livis/taskweaver/role/translator.py", line 277, in parse_llm_output_stream_v2
for ev in parser:
File "/home/work/projects/livis_demo/livis/taskweaver/utils/json_parser.py", line 438, in parse_json_stream
for chunk in itertools.chain(token_stream, [None]):
File "/home/work/projects/livis_demo/livis/taskweaver/role/translator.py", line 57, in stream_filter
for c in s:
File "/home/work/projects/livis_demo/livis/taskweaver/llm/__init__.py", line 292, in _stream_smoother
raise llm_source_error # type:ignore
File "/home/work/projects/livis_demo/livis/taskweaver/llm/__init__.py", line 240, in base_stream_puller
for msg in stream:
File "/home/work/projects/livis_demo/livis/taskweaver/llm/openai.py", line 179, in chat_completion
for stream_res in res:
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/openai/_streaming.py", line 46, in __iter__
for item in self._iterator:
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/openai/_streaming.py", line 58, in __stream__
for sse in iterator:
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/openai/_streaming.py", line 50, in _iter_events
yield from self._decoder.iter_bytes(self.response.iter_bytes())
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/openai/_streaming.py", line 280, in iter_bytes
for chunk in self._iter_chunks(iterator):
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/openai/_streaming.py", line 291, in _iter_chunks
for chunk in iterator:
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpx/_models.py", line 829, in iter_bytes
for raw_bytes in self.iter_raw():
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpx/_models.py", line 883, in iter_raw
for raw_stream_bytes in self.stream:
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpx/_client.py", line 126, in __iter__
forchunkin self._stream:
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpx/_transports/default.py", line 112, in __iter__
with map_httpcore_exceptions():
File "/home/miniconda3/envs/taskweaver/lib/python3.10/contextlib.py", line 153, in __exit__
self.gen.throw(typ, value, traceback)
File "/home/miniconda3/envs/taskweaver/lib/python3.10/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
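For background, `incomplete chunked read` means the HTTP peer (here, the LLM server) closed the connection before sending the terminating zero-length chunk of a chunked-encoded response body. The sketch below is an illustrative, simplified decoder showing the condition being checked; it is not httpcore's actual implementation.

```python
def decode_chunked(data: bytes) -> bytes:
    """Decode an HTTP/1.1 chunked-encoded body (simplified; no trailers).

    Raises ValueError if the stream ends before the terminating
    zero-length chunk, the condition httpcore reports as
    'incomplete chunked read'.
    """
    body = b""
    pos = 0
    while True:
        nl = data.find(b"\r\n", pos)
        if nl == -1:
            raise ValueError("peer closed connection: missing chunk-size line")
        size = int(data[pos:nl], 16)  # chunk size is a hex number
        pos = nl + 2
        if size == 0:
            return body  # terminating chunk seen: body is complete
        if pos + size + 2 > len(data):
            raise ValueError("peer closed connection: incomplete chunked read")
        body += data[pos:pos + size]
        pos += size + 2  # skip chunk data plus its trailing CRLF

# A complete chunked body...
complete = b"4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n"
print(decode_chunked(complete))  # b'Wikipedia'

# ...and one truncated mid-chunk, as in the traceback above.
truncated = b"4\r\nWiki\r\n5\r\nped"
```

Since the error originates inside the server's response stream, it points at the serving side (the inference framework or a proxy in front of it) closing the connection early, rather than at TaskWeaver's client code.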
Environment Information (please complete the following information):
OS: linux, openai==1.34.0
Python Version : 3.10
LLM that you're using: openai_api_server from LLaMA-Factory
Other Configurations except the LLM api/key related:
Is this a constant issue, or does it only happen occasionally? I ran a similar query against Azure OpenAI's endpoint and could not reproduce it. Could you try a different endpoint? Sometimes the inference framework is the root cause.
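If the drop is intermittent, one client-side mitigation while testing endpoints is to retry the streamed completion when the connection dies mid-stream. This is a hedged sketch, not a TaskWeaver API: `retry_stream`, `make_stream`, and the flaky-endpoint simulation are illustrative names, and the real failure would surface as `httpx.RemoteProtocolError` rather than the generic `ConnectionError` used here.

```python
import time
from typing import Callable, Iterable, Iterator

def retry_stream(
    make_stream: Callable[[], Iterable[str]],
    attempts: int = 3,
    backoff: float = 0.0,
) -> Iterator[str]:
    """Re-create and re-consume a stream if it fails mid-iteration.

    Chunks are buffered per attempt and only yielded once the stream
    finishes cleanly, so a retry never emits partial output twice;
    this trades streaming latency for correctness.
    """
    for attempt in range(1, attempts + 1):
        chunks = []
        try:
            for chunk in make_stream():
                chunks.append(chunk)
            yield from chunks
            return
        except ConnectionError:  # stand-in for httpx.RemoteProtocolError
            if attempt == attempts:
                raise
            time.sleep(backoff)

# Simulated flaky endpoint: fails on the first call, succeeds afterwards.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    yield "Hello"
    if calls["n"] == 1:
        raise ConnectionError("peer closed connection")  # simulated truncation
    yield " world"

print("".join(retry_stream(flaky)))  # "Hello world" after one retry
```

A retry only masks the symptom, though; if the endpoint truncates responses consistently, the inference framework itself needs investigating.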