
feat(idp-extraction-connector): added compatibility with llms lacking system message support #3578

Conversation

sahilbhatoacamunda
Contributor

Description

  • Added support for selecting a different system prompt based on the modelId input.
  • Refactored the AWS Bedrock call so that a system message is included only for LLMs that support it.
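The conditional handling described above can be sketched as follows. This is a minimal illustration, not the connector's actual code: the `LlmModel` record, its fields, and the method names are all hypothetical stand-ins for whatever model registry the connector uses.

```java
import java.util.Optional;

public class PromptBuilder {

    // Hypothetical model descriptor; the real connector resolves this
    // from the modelId input.
    record LlmModel(String modelId, boolean supportsSystemMessage) {}

    // Return the system prompt as a dedicated system message only when
    // the model supports one; otherwise return empty so no system
    // message is sent to Bedrock.
    static Optional<String> systemMessage(LlmModel model, String systemPrompt) {
        return model.supportsSystemMessage()
                ? Optional.of(systemPrompt)
                : Optional.empty();
    }

    // For models lacking system-message support, fold the system prompt
    // into the user message instead, so the instructions still reach the LLM.
    static String userMessage(LlmModel model, String systemPrompt, String userPrompt) {
        return model.supportsSystemMessage()
                ? userPrompt
                : systemPrompt + "\n\n" + userPrompt;
    }
}
```

The key design point is that the prompt content is preserved either way; only its delivery channel changes depending on the model's capabilities.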

Related issues

closes #3572

Checklist

  • PR has a milestone or the no milestone label.

@sahilbhatoacamunda sahilbhatoacamunda added this pull request to the merge queue Nov 5, 2024
Merged via the queue into main with commit b05e119 Nov 5, 2024
12 of 13 checks passed
@sahilbhatoacamunda sahilbhatoacamunda deleted the add-compatibility-with-llms-lacking-system-message-support branch November 5, 2024 15:28
mathias-vandaele pushed a commit that referenced this pull request Nov 8, 2024
… system message support (#3578)

* feat(idp-extraction-connector): added compatibility with llms lacking system message support

* feat(idp-extraction-connector): moved system_prompt_variable_template to llm model

* feat(idp-extraction-connector): added vendor to llm model
sbuettner pushed a commit that referenced this pull request Dec 5, 2024
Development

Successfully merging this pull request may close these issues.

Enhance IDP Camunda connector for compatibility with LLMs lacking system message support