The goal of PromptBuilder is to take prompt templates, that is, strings with variables (specifically Jinja templates), and fill in those variables with values coming from other components in the pipeline or from the pipeline inputs. The output of this component is one or more prompts, where a prompt is a string that the LLM can use directly.
PromptBuilder is not tokenizer-aware: the prompt will be checked for length by the LLM component before inference. If the need arises, we may extend the component later.
Draft I/O for PromptBuilder:
```python
@component
class PromptBuilder:
    def __init__(self, template: Union[str, Path]):
        self.template = ...  # download the template
        template_variables = ...  # extract the variables from the template text
        component.set_input_parameters(**{var: Any for var in template_variables})

    @component.output_types(prompts=List[str])
    def run(self, **kwargs):
        # Render the template using the variables
        return {"prompts": prompts}
```
Due to the dynamic nature of prompt templates, the PromptBuilder.run() method takes **kwargs, which contains all the variables to be filled in the template. However, for this component to work with Canals, we need to know in advance which values this dict will contain: therefore, users must specify the template to use in the component's __init__.
The template's variables cannot be changed at runtime.
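As a minimal sketch of the two steps involved, here is how variable extraction at init time and rendering at run time could work with plain Jinja2 (the `component` decorator and Canals wiring are omitted; the example template and values are made up for illustration):

```python
from jinja2 import Environment, meta

# At init time: parse the template and collect its undeclared variables,
# so the component's input parameters can be declared before the pipeline runs.
env = Environment()
source = "Context: {{ context }}\nQuestion: {{ question }}"
template_variables = meta.find_undeclared_variables(env.parse(source))
print(sorted(template_variables))  # ['context', 'question']

# At run time: fill the variables (the **kwargs of run()) to produce the prompt.
prompt = env.from_string(source).render(
    context="Paris is in France.",
    question="Where is Paris?",
)
print(prompt)
```

`meta.find_undeclared_variables` returns a set of variable names, which is what makes it possible to turn each template variable into a declared pipeline input up front.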
See the LLM proposal: #5540