[Bug]: TypeError: can only concatenate str (not "dict") to str #6958
Comments
Can you share the request being made to litellm for repro?
I also hit the same issue using ollama_chat/llama3.1 running locally (Meta Llama 3.1 8B Instruct). Interestingly, ollama/llama3.1 did not have the same issue; I expect that is because it does not track consumed tokens. Adding str(function_arguments) solved the issue.
@krrishdholakia yes. I made a simple demo to reproduce this error.
I think we're hitting this too as a downstream user of litellm; see robusta-dev/holmesgpt#246.
What happened?
Environment information
I use autogen + litellm + ollama for my local testing.
When doing a tool_call, litellm raises an error in the token_counter method: TypeError: can only concatenate str (not "dict") to str.
I debugged it with VS Code and found that function_arguments is a dict, not a str.
Can someone help check whether this is a bug?
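The TypeError itself can be reproduced in isolation without litellm; a minimal sketch, assuming the backend hands back function_arguments as an already-parsed dict rather than the usual JSON string:

```python
text = "prompt so far: "
function_arguments = {"location": "Paris"}  # dict, not a JSON str

try:
    # This mirrors the failing concatenation inside token_counter
    text += function_arguments
except TypeError as exc:
    print(exc)  # can only concatenate str (not "dict") to str
```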
A simple fix is to change line 1638 of utils.py from:
text += function_arguments
to:
text += str(function_arguments)
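A slightly more robust variant of that one-liner, sketched as a standalone helper (the name append_function_arguments is hypothetical, not litellm's API): serializing dicts with json.dumps keeps the counted text close to the JSON string the model actually emitted, whereas str() would produce Python-repr quoting.

```python
import json

def append_function_arguments(text: str, function_arguments) -> str:
    # Hypothetical helper illustrating the proposed guard:
    # function_arguments may be the usual JSON string, or an
    # already-parsed dict with some backends (e.g. ollama_chat).
    if isinstance(function_arguments, dict):
        function_arguments = json.dumps(function_arguments)
    return text + function_arguments
```

Either way, the concatenation no longer assumes function_arguments is a str.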