proposal: LLM support in Haystack 2.0 #5540
Conversation
Thanks @ZanSara, this is huge! I like that the new LLM capabilities should provide a clean abstraction. I've added some comments to make sure that the new LLM integration provides the capabilities that are needed in practice.
Note how the component takes only a list of prompts and LLM parameters, but no variables nor templates, and returns only strings. This is because input rendering and output parsing are delegated to separate components, whose descriptions follow. A sketch of this interface is given below.
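A minimal sketch of what such a prompts-in/strings-out component could look like, assuming the `@component` decorator and `@component.output_types` API that Haystack 2.x exposes; the class name `SimpleLLM`, the `_call_model` helper, and the exact parameter set are illustrative assumptions, not the proposal's final interface:

```python
from typing import List, Optional

from haystack import component


@component
class SimpleLLM:
    """Hypothetical LLM component: prompts and generation parameters in, plain strings out.

    Rendering variables into prompts and parsing the replies are left to
    separate upstream/downstream components.
    """

    def __init__(self, model: str, api_key: Optional[str] = None):
        self.model = model
        self.api_key = api_key

    @component.output_types(replies=List[str])
    def run(self, prompts: List[str], max_tokens: int = 256, temperature: float = 0.7):
        # One reply per prompt; _call_model stands in for the real backend call.
        replies = [self._call_model(p, max_tokens, temperature) for p in prompts]
        return {"replies": replies}

    def _call_model(self, prompt: str, max_tokens: int, temperature: float) -> str:
        raise NotImplementedError("Backend-specific API call goes here.")
```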
Note: whether LLM components accept multiple prompts or a single one depends only on whether we want the LLM to support batching of prompts. Therefore it's an implementation decision that will be evaluated once we know the internals of the component. We strive to keep the interfaces as similar as possible to ease switching between the various LLMs, but we won't force identical interfaces on them where it doesn't make sense with respect to their internal implementation.
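For instance, an LLM whose backend offers no batched endpoint could expose a single-prompt interface instead of faking a batch API. A sketch under the same assumptions as above, with the hypothetical name `SingleShotLLM`:

```python
from haystack import component


@component
class SingleShotLLM:
    """Hypothetical non-batching variant: one prompt in, one reply out."""

    @component.output_types(reply=str)
    def run(self, prompt: str, max_tokens: int = 256):
        # The backend accepts one prompt per request, so the interface
        # mirrors that rather than wrapping everything in lists.
        return {"reply": self._call_model(prompt, max_tokens)}

    def _call_model(self, prompt: str, max_tokens: int) -> str:
        raise NotImplementedError("Backend-specific API call goes here.")
```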
Modified in response to https://github.com/deepset-ai/haystack/pull/5540/files#r1290713149
Just a comment to remind us of this... Both for completion models and chat completion models, somewhere the prompt should be adapted from simple text to text containing special tokens (see, e.g., EasyLLM for reference).
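To make the point concrete, here is a small, self-contained sketch of such an adaptation for the Llama-2 chat format; the `[INST]`/`<<SYS>>` tokens are the ones Llama-2 chat models were trained with, the function name `to_llama2_prompt` is hypothetical, and libraries such as EasyLLM ship ready-made converters of this kind:

```python
def to_llama2_prompt(user_message: str, system_message: str = "You are a helpful assistant.") -> str:
    """Wrap a plain-text prompt in the special tokens Llama-2 chat models expect."""
    return (
        f"<s>[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )


print(to_llama2_prompt("Summarize the Haystack 2.0 LLM proposal."))
# <s>[INST] <<SYS>>
# You are a helpful assistant.
# <</SYS>>
#
# Summarize the Haystack 2.0 LLM proposal. [/INST]
```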