[bug] anthropic model strings in PREFERRED_MODELS not mapping to litellm providers #325
base: main
Conversation
Please undo the linter changes in the markdown file. Linter rules should be agreed upon and used by everyone contributing to the repo.
I also tested the claude-opus string and it died too. I'll see if I have some time this week to test each model and update. Do you all have any idea of how you'd want to handle updating those models in the future without a PR, or is a PR fine? ✅ It's such a small list it's probably not worth me worrying about. IDK what other tools you have handy in your stack. Something something DB-driven, cloud, idk haha.
Chatted with @bboynton97 + @tcdent briefly; I think just semi-frequently updating is the play.
Sorry for the delay. I updated the two Claude models and added a test. Well, it works on my machine; revisiting the test 👁️
better late than never :)
📥 Pull Request
📘 Description
The CLI passes a bad model string to the Anthropic API.
I also fixed a typo and a contradiction in CONTRIBUTING.md.
I replaced the model string here (which also appears elsewhere in the codebase) with the provider-prefixed format from https://docs.litellm.ai/docs/providers/anthropic, then reran the same wizard setup and it passed.
I also manually edited the agent file the wizard created for my initial broken agent to validate that this was at least potentially the cause, and that fixed it too.
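To illustrate the format change (the exact model identifier below is an example, not copied from this PR): litellm routes Anthropic models through an `anthropic/` provider prefix, so a bare string that fails to map gets replaced with a prefixed one.

```python
import litellm

# Bare strings like "claude-3-opus" may not resolve to a litellm provider and
# fail when the request is made. The provider-prefixed form documented at
# https://docs.litellm.ai/docs/providers/anthropic resolves cleanly.
# The model id below is illustrative; check the litellm docs for current names.
# Requires ANTHROPIC_API_KEY to be set in the environment.
response = litellm.completion(
    model="anthropic/claude-3-opus-20240229",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```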
LMK if there is a better place for this change! Maybe it would be cool to shift this to a database config that we populate on deploy and fetch at runtime, only falling back to a hardcoded list if the network is doing wonky stuff. See https://github.com/BerriAI/litellm/blob/main/litellm/constants.py, although even litellm hardcodes it. Maybe we just snag their list since we use their repo 🤷♂️ IIRC we can just curl the file from the commit we have installed and compose whatever list we want (rough sketch below). ♾️ options here.
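One low-effort variant of that idea, sketched under the assumption that `litellm.models_by_provider` keeps its current provider-to-model-list dict shape in the installed litellm version (`FALLBACK_ANTHROPIC_MODELS` is a hypothetical constant, not something in this repo):

```python
import litellm

# Illustrative fallback; the real list would live wherever PREFERRED_MODELS does.
FALLBACK_ANTHROPIC_MODELS = ["anthropic/claude-3-opus-20240229"]


def anthropic_models_from_litellm() -> list[str]:
    """Derive Anthropic model strings from the installed litellm version."""
    try:
        # models_by_provider maps provider name -> list of bare model names.
        models = litellm.models_by_provider.get("anthropic", [])
    except AttributeError:
        # Attribute moved or renamed in a newer litellm; fall back to our list.
        return FALLBACK_ANTHROPIC_MODELS
    return [f"anthropic/{m}" for m in models] or FALLBACK_ANTHROPIC_MODELS
```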
🧪 Testing
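A minimal sketch of the kind of check this PR adds (the actual test may differ; the `agentstack.providers` import path is a guess, and it assumes `litellm.get_llm_provider` raises when a string cannot be resolved to a provider):

```python
import litellm
import pytest

from agentstack.providers import PREFERRED_MODELS  # hypothetical import path


@pytest.mark.parametrize("model", PREFERRED_MODELS)
def test_preferred_models_map_to_a_litellm_provider(model):
    # get_llm_provider returns (model, provider, api_key, api_base) and raises
    # when the string does not map to any known litellm provider.
    _, provider, _, _ = litellm.get_llm_provider(model)
    assert provider, f"{model} did not resolve to a litellm provider"
```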