feat: providing better error message for unsupported max_prompt_tokens (#…
adubovik authored Sep 6, 2024
1 parent f60c185 commit f4a1b58
Showing 1 changed file with 5 additions and 0 deletions.
aidial_adapter_vertexai/chat_completion.py

@@ -80,6 +80,11 @@ async def chat_completion(self, request: Request, response: Response):

         discarded_messages: List[int] = []
         if params.max_prompt_tokens is not None:
+            if not is_implemented(model.truncate_prompt):
+                raise ValidationError(
+                    "max_prompt_tokens request parameter is not supported"
+                )
+
             prompt, discarded_messages = await model.truncate_prompt(
                 prompt, params.max_prompt_tokens
            )
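The added guard depends on an is_implemented helper that tells whether the concrete model actually overrides truncate_prompt or merely inherits a base-class stub. Below is a minimal sketch of one way such a check can work; the not_implemented decorator, the class names, and the marker-attribute mechanism are illustrative assumptions, not necessarily the adapter's actual code.

# Sketch only: an is_implemented-style capability check (assumed names).
from typing import Callable, List, Tuple

def not_implemented(fn: Callable) -> Callable:
    # Mark a base-class method as a stub that subclasses may override.
    fn.is_not_implemented = True
    return fn

def is_implemented(method: Callable) -> bool:
    # Bound methods forward attribute lookups to the underlying function,
    # so the marker set by the decorator is visible here.
    return not getattr(method, "is_not_implemented", False)

class ChatModel:
    @not_implemented
    async def truncate_prompt(
        self, prompt: str, max_prompt_tokens: int
    ) -> Tuple[str, List[int]]:
        raise NotImplementedError

class TruncatingModel(ChatModel):
    async def truncate_prompt(
        self, prompt: str, max_prompt_tokens: int
    ) -> Tuple[str, List[int]]:
        return prompt, []  # trivial example: nothing discarded

# is_implemented(ChatModel().truncate_prompt)        -> False
# is_implemented(TruncatingModel().truncate_prompt)  -> True

With a guard like this, a request that sets max_prompt_tokens against a model that cannot truncate its prompt fails fast with a descriptive ValidationError instead of surfacing an opaque NotImplementedError from deeper in the call stack.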
