In `project.toml`, we do not pin specific versions of `openai` and `litellm`. If a user runs `pip install .` directly, the newest `openai` release gets installed, which no longer supports `import openai.error` — yet `api_tools.py` still relies on that import. Pinning an older version of `openai` is not an option either, because `litellm` does not support it. We therefore need to refactor `api_tools.py` to work with the current versions of both `openai` and `litellm`.
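One way to make `api_tools.py` tolerant of both layouts is a compatibility shim at import time. This is only a sketch (the exact exception names used in `api_tools.py` are not shown in this issue): in `openai>=1.0` the exception classes live at the top of the package, while `openai<1.0` kept them in the now-removed `openai.error` module. A final fallback keeps the module importable even when `openai` is absent.

```python
# Hypothetical compatibility shim for api_tools.py.
# openai>=1.0 exposes exceptions at the package top level;
# openai<1.0 kept them in the openai.error module (removed in 1.0).
try:
    from openai import APIError, RateLimitError  # openai>=1.0
except ImportError:
    try:
        from openai.error import APIError, RateLimitError  # openai<1.0
    except ImportError:
        # openai not installed at all; degrade so the module still imports
        APIError = RateLimitError = Exception

# Single tuple the rest of api_tools.py can catch, regardless of version
RETRYABLE_ERRORS = (APIError, RateLimitError)
```

With this in place, call sites can write `except RETRYABLE_ERRORS:` instead of referencing either version-specific path directly.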
I think this has become more important now. I was trying to use `gpt-4-turbo-preview` as the base model but kept getting `litellm` errors saying the model could not be mapped correctly, which I believe is due to this version mismatch. @zhaochenyang20 are you refactoring this?
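Until the refactor lands, the mismatch could also be narrowed by declaring lower bounds in the project's dependency table. This is a sketch, not the project's actual configuration — the table layout depends on the build backend, and the exact minimum versions would need to be verified against what `api_tools.py` ends up supporting:

```toml
# Hypothetical constraint for project.toml: require the post-1.0 openai
# layout (top-level exceptions, no openai.error) that litellm expects.
[project]
dependencies = [
    "openai>=1.0",
    "litellm>=1.0",
]
```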