Merge pull request #1295 from Codium-ai/tr/azure_o1
fix: correct model type extraction for O1 model handling
mrT23 authored Oct 19, 2024
2 parents e82afdd + dcb7b66 commit 0dccfdb
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion pr_agent/algo/ai_handlers/litellm_ai_handler.py
@@ -188,7 +188,8 @@ async def chat_completion(self, model: str, system: str, user: str, temperature:

 # Currently O1 does not support separate system and user prompts
 O1_MODEL_PREFIX = 'o1-'
-if model.startswith(O1_MODEL_PREFIX):
+model_type = model.split('/')[-1] if '/' in model else model
+if model_type.startswith(O1_MODEL_PREFIX):
     user = f"{system}\n\n\n{user}"
     system = ""
     get_logger().info(f"Using O1 model, combining system and user prompts")
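The point of the change: LiteLLM-style model names may carry a provider prefix (e.g. `azure/o1-mini`), so a raw `model.startswith('o1-')` check misses Azure-hosted O1 models. The fix strips everything up to the last `/` before checking the prefix. A minimal standalone sketch of that logic (the `combine_prompts_for_o1` helper and the example model names are hypothetical, not part of the actual `litellm_ai_handler.py`):

```python
O1_MODEL_PREFIX = 'o1-'


def combine_prompts_for_o1(model: str, system: str, user: str) -> tuple[str, str]:
    """Fold the system prompt into the user prompt for O1-family models,
    which do not support a separate system message."""
    # Strip any provider prefix such as "azure/" before checking the family,
    # so "azure/o1-mini" is recognized the same as "o1-mini".
    model_type = model.split('/')[-1] if '/' in model else model
    if model_type.startswith(O1_MODEL_PREFIX):
        return "", f"{system}\n\n\n{user}"
    return system, user
```

Before this commit, a model string like `azure/o1-mini` would fail the `startswith` check and keep the separate system prompt that O1 rejects; after it, both the bare and provider-prefixed forms take the combined-prompt path.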
