AgentScope is an open-source project. To involve a broader community, we recommend asking your questions in English.
Describe the bug
When running the conversation_in_stream_mode example with the stream_ollama model config, the following error is raised:
File "/Users/xxxx/work/agent/conversationinstream.py", line 114, in
msg = agent(msg)
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/site-packages/agentscope/agents/agent.py", line 135, in call
res = self.reply(*args, **kwargs)
File "/Users/xxxx/work/agent/conversationinstream.py", line 90, in reply
self.speak(res.stream)
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/site-packages/agentscope/agents/agent.py", line 174, in speak
for last, text_chunk in content:
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/site-packages/agentscope/models/response.py", line 95, in _stream_generator_wrapper
for text in self._stream:
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/site-packages/agentscope/models/ollama_model.py", line 213, in generator
self._save_model_invocation_and_update_monitor(
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/site-packages/agentscope/models/ollama_model.py", line 258, in _save_model_invocation_and_update_monitor
self._save_model_invocation(
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/site-packages/agentscope/models/model.py", line 379, in _save_model_invocation
FileManager.get_instance().save_api_invocation(
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/site-packages/agentscope/manager/_file.py", line 185, in save_api_invocation
json.dump(record, file, indent=4, ensure_ascii=False)
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/json/init.py", line 179, in dump
for chunk in iterable:
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/json/encoder.py", line 431, in _iterencode
yield from _iterencode_dict(o, _current_indent_level)
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/json/encoder.py", line 405, in _iterencode_dict
yield from chunks
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/json/encoder.py", line 438, in _iterencode
o = _default(o)
File "/Users/xxxx/miniconda3/envs/agentscope/lib/python3.9/json/encoder.py", line 179, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type ChatResponse is not JSON serializable
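The failure happens when the streamed chunks are written into the saved invocation record: with recent versions of the ollama Python client (0.4+), chat() yields ChatResponse objects (pydantic models) rather than plain dicts, and json.dump cannot serialize them. A minimal sketch of the underlying issue and a possible workaround, assuming ollama >= 0.4 (model name and prompt are placeholders, not taken from the report):

import json
from ollama import chat

chunks = []
for chunk in chat(
    model="llama3",                        # assumed: any locally pulled model
    messages=[{"role": "user", "content": "hi"}],
    stream=True,
):
    # chunk is a ChatResponse; json.dumps(chunk) raises the TypeError above,
    # because a pydantic model is not a basic JSON type.
    chunks.append(chunk.model_dump())      # convert to a plain, serializable dict

# With plain dicts the record can be dumped the same way _file.py does.
print(json.dumps(chunks[-1], indent=4, ensure_ascii=False, default=str))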
To Reproduce
Steps to reproduce the behavior:
change
YOUR_SELECTED_MODEL_CONFIG_NAME = "stream_ds"
to
YOUR_SELECTED_MODEL_CONFIG_NAME = "stream_ollama"
and run main.py
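For reference, the stream_ollama config used here is assumed to look roughly like the following (the exact model_name is a placeholder for whatever model has been pulled locally with `ollama pull`):

import agentscope

agentscope.init(
    model_configs=[
        {
            "config_name": "stream_ollama",
            "model_type": "ollama_chat",
            "model_name": "llama3",   # placeholder
            "stream": True,
        },
    ],
)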
Expected behavior
The streaming conversation should run with the stream_ollama config just as it does with stream_ds, and the model invocation record should be saved without a serialization error.
Error messages
TypeError: Object of type ChatResponse is not JSON serializable
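For context, this TypeError comes straight from the standard json encoder: any object that is not a basic JSON type needs a default= hook or must be converted beforehand. A self-contained sketch (ChatResponseLike is a hypothetical stand-in, not the real ollama class):

import json

class ChatResponseLike:
    # Hypothetical stand-in for ollama's ChatResponse; any custom object
    # triggers the same "not JSON serializable" TypeError.
    def __init__(self, content: str) -> None:
        self.content = content

record = {"response": ChatResponseLike("hello")}

# json.dumps(record, indent=4)  # -> TypeError: Object of type ChatResponseLike is not JSON serializable

# Supplying a default= hook (or converting the object beforehand) avoids it:
print(json.dumps(record, indent=4, default=lambda o: getattr(o, "__dict__", str(o))))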
Environment (please complete the following information):
AgentScope Version: [e.g. 0.0.2 via print(agentscope.__version__)]
Python Version: [e.g. 3.9]
OS: [e.g. macos, windows]
Additional context
Add any other context about the problem here.