Issues: BerriAI/litellm
[Feature]: aiohttp migration - 10-100x Higher RPS Master ti...
#7544 opened Jan 4, 2025 by ishaan-jaff
[Bug]: VertexAI custom model does not pick up uploaded token
bug, mlops user request
#8597 opened Feb 17, 2025 by suresiva
[Bug]: KeyError: 'name' error with local ollama models
bug
#8594 opened Feb 17, 2025 by hajdul88
[Bug]: _return_huggingface_tokenizer missing models [Patch]
bug
#8587 opened Feb 17, 2025 by Mte90
[Bug]: Usage UI doesn't have a monthly filter
bug
#8586 opened Feb 17, 2025 by Mte90
[Bug]: UI Home page and Usage show different Total spend
bug
#8585 opened Feb 17, 2025 by Mte90
[Bug]: Cannot set internal_user max_budget to 0 via UI or API
bug
#8584 opened Feb 17, 2025 by domg8man
[Bug]: 'utf-8' codec can't encode characters in position
bug
#8583 opened Feb 17, 2025 by vincent-amchrus
[Bug]: double logging and inconsistent logs in langfuse
bug
#8581 opened Feb 17, 2025 by ksundarraj-c-rpx
[Bug]: Cost calculation for Gemini 2 Flash on Vertex raises ">128k tokens" error
bug
#8579 opened Feb 17, 2025 by mehertz
[Bug]: Model hub not showing customized costs
bug, mlops user request
#8573 opened Feb 16, 2025 by xmcp
[Bug]: x-litellm-cache-key header not being returned on cache hit
bug
#8570 opened Feb 16, 2025 by mirodrr2
[Feature]: Support selectively dropping or selectively sending traces
enhancement, mlops user request
#8569 opened Feb 16, 2025 by deepakdeore2004
[Bug]: Can't stream Deepseek on Vertex AI Model Garden
bug, feb 2025
#8564 opened Feb 15, 2025 by emorling
[Bug]: LiteLLM not correctly handling several datatypes originally supported by Gemini
bug
#8557 opened Feb 15, 2025 by alymedhat10
[Bug]: When using hosted_vLLM, the generic token counter is used even if custom_tokenizer is configured
bug
#8555 opened Feb 15, 2025 by m4oc
[Feature]: Add Multi-Modal Output Support (image-to-image, image-to-video, text-to-video)
enhancement
#8548 opened Feb 14, 2025 by mvrodrig
[Bug]: function_to_dict() fails on enums
bug
#8539 opened Feb 14, 2025 by vlerenc
base_model silently being dropped if it's not one of a small number of models
#8538 opened Feb 14, 2025 by andrewbolster
[Bug]: Model Not Mapped Yet - Unable to Use
awaiting: user response, bug
#8536 opened Feb 14, 2025 by Henrik404
Error: Sagemaker error "Too little data for declared Content-Length"
#8534 opened Feb 14, 2025 by massi-ang
Fix: "Create new key" button still showing when in Edit key mode
#8531 opened Feb 14, 2025 by tahaali-dev
Fix: The horizontal scroll list overflows, causing content to be cut off and details to not display properly
#8530 opened Feb 14, 2025 by tahaali-dev
[Bug]: Bug in litellm.utils for checking function calling
bug, mlops user request
#8521 opened Feb 13, 2025 by kdziedzic68
Cached / multimodal tokens are not passed through to Langfuse
bug
#8515 opened Feb 13, 2025 by hassiebp