Unsetting litellm_params from the client per request? #3292
-
On the client side, how would I "unset" the following?

```yaml
model_list:
  - model_name: gpt-4-0125-preview
    litellm_params:
      model: azure/gpt-4-0125-preview
      api_key: os.environ/AZURE_API_KEY_EASTUS
      api_version: "2024-03-01-preview"
      api_base: "https://aimoda-is-epic.openai.azure.com"
      seed: 1337
```

i.e. instead of
Replies: 3 comments 6 replies
-
hey @Manouchehri missed this -- Would that work?
-
@Manouchehri you can now do this by specifying which params you want to drop:

```yaml
- model_name: my-model
  litellm_params:
    api_base: my-base
    model: openai/my-model
    additional_drop_params: ["response_format"] # 👈 KEY CHANGE
```
-
To control this per request, use:

```shell
curl -v "${OPENAI_API_BASE}/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-3.5-turbo",
    "seed": null,
    "messages": [
      {
        "role": "user",
        "content": "what is your name"
      }
    ]
  }'
```
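A minimal Python sketch of the same request body: setting `seed` to Python `None` serializes to JSON `null`, which is exactly what the curl command sends (the model name and message here are just placeholders from the example above).

```python
import json

# Per-request payload: "seed": null asks the proxy to override the
# seed configured server-side in litellm_params for this model.
payload = {
    "model": "gpt-3.5-turbo",
    "seed": None,  # Python None serializes to JSON null
    "messages": [{"role": "user", "content": "what is your name"}],
}

body = json.dumps(payload)
print(body)  # the serialized body contains "seed": null
```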