
feat: enable custom templating for llm prompts #528

Open
wants to merge 2 commits into base: llm_ops_v2

Conversation

muhammed-shihebi
Collaborator

What does this PR do?

This PR makes it possible to use each LLM's own prompting template to format chat messages into a prompt suited to that specific model.
A list of messages of different kinds comes from the frontend, and the new code automatically formats them into a prompt appropriate for the chosen LLM.
Note that the prompting template must already be added to the `llm-ops.llm-ops.prompts.conversation.py` file.
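
As a rough sketch of the intended flow (the names `CHAT_TEMPLATES` and `build_prompt`, and the template strings below, are illustrative assumptions and do not reflect the actual contents of `conversation.py`), per-model formatting could look like this:

```python
# Illustrative sketch only: registry keys, role templates, and function names
# are hypothetical, not the real conversation.py implementation.
from typing import Dict, List

# Per-model prompt templates, keyed by model identifier.
CHAT_TEMPLATES: Dict[str, Dict[str, str]] = {
    "llama-2": {
        "system": "<<SYS>>\n{content}\n<</SYS>>\n\n",
        "user": "[INST] {content} [/INST]",
        "assistant": " {content} ",
    },
    "vicuna": {
        "system": "{content}\n\n",
        "user": "USER: {content}\n",
        "assistant": "ASSISTANT: {content}\n",
    },
}


def build_prompt(model_identifier: str, messages: List[Dict[str, str]]) -> str:
    """Format a list of {"role": ..., "content": ...} messages coming from the
    frontend into a single prompt string using the template registered for the
    chosen model."""
    try:
        template = CHAT_TEMPLATES[model_identifier]
    except KeyError:
        raise ValueError(f"No prompt template registered for '{model_identifier}'")
    return "".join(
        template[message["role"]].format(content=message["content"])
        for message in messages
    )


if __name__ == "__main__":
    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this conversation."},
    ]
    print(build_prompt("llama-2", messages))
```

A model without an entry in the registry would raise an error rather than fall back to a generic prompt, which is why the template has to be registered in `conversation.py` beforehand.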

Who can review?

@Rachneet
