Replies: 6 comments 13 replies
-
🤖 Hey @ldorigo! Great to see you diving deep into the intricacies of LangChain again. Looking forward to tackling this puzzle together. 🚀

Based on your code and the problem you're facing, it seems like you want to pass an additional argument (`assistant_id`) to your tool. One way to handle this is to include the field in the input model with a default value:

```python
class SearchKBToolInput(BaseModel):
    query: str = Field(description="The contextualized question to search for in the knowledge base.")
    assistant_id: Optional[str] = Field(default=None)
```

Then, in your `_arun` and `_run` methods, fall back to a default when no value is provided:

```python
async def _arun(self, query: str, assistant_id: Optional[str] = None, **kwargs) -> SearchKBToolOutput:
    if assistant_id is None:
        assistant_id = "default_assistant_id"  # replace with your default value
    result = await self.retrieval_chain.ainvoke(
        {"query": query},
        {"configurable": {"search_kwargs": {"namespace": assistant_id}}},
    )
    return result

def _run(self, query: str, assistant_id: Optional[str] = None, **kwargs) -> SearchKBToolOutput:
    if assistant_id is None:
        assistant_id = "default_assistant_id"  # replace with your default value
    result = self.retrieval_chain.invoke(
        {"query": query},
        {"configurable": {"search_kwargs": {"namespace": assistant_id}}},
    )
    return result
```

This way, you can pass the `assistant_id` explicitly when you invoke the tool. Please note that this solution assumes that you're okay with having a default value for `assistant_id` when none is supplied. I hope this helps! If you have any other questions or need further clarification, feel free to ask.
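Caller-side, the fallback logic above can be sketched without any LangChain dependency (the names `DEFAULT_ASSISTANT_ID` and `build_retrieval_config` are illustrative, not from the thread's code):

```python
from typing import Optional

DEFAULT_ASSISTANT_ID = "default_assistant_id"  # hypothetical fallback value

def build_retrieval_config(assistant_id: Optional[str] = None) -> dict:
    """Build the per-call `configurable` dict the retrieval chain receives."""
    if assistant_id is None:
        assistant_id = DEFAULT_ASSISTANT_ID
    return {"configurable": {"search_kwargs": {"namespace": assistant_id}}}

# An explicitly passed assistant_id wins; the default only fills the gap.
explicit = build_retrieval_config("assistant-42")
fallback = build_retrieval_config()
```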
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
One way to do this is to create the tool dynamically:

```python
def create_my_tool(my_arg) -> Tool:
    @tool  # <-- use the `tool` decorator to make your life easier
    def tool_name(arg1, arg2, ...):
        """Tool Description"""
        ...  # use my_arg, arg1, arg2 here; my_arg is captured from the closure
```
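A dependency-free sketch of that closure pattern (all names here are illustrative): the factory bakes the manual argument into the inner function, so the signature a tool schema would be built from exposes only the LLM-fillable parameters.

```python
import inspect

def create_search_tool(assistant_id: str):
    """Factory: returns a tool function with assistant_id baked in."""
    def search_kb(query: str) -> str:
        """Search the knowledge base for the contextualized question."""
        # assistant_id is captured from the enclosing scope, not exposed
        # as a parameter, so an LLM never sees (or generates) it.
        return f"namespace={assistant_id} query={query}"
    return search_kb

tool_fn = create_search_tool("assistant-42")
params = list(inspect.signature(tool_fn).parameters)  # only ["query"]
```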
-
How will you do this with an agent?
-
I found an easy general solution - just mark the pydantic model with `extra = Extra.allow` so additional arguments are not discarded:

```python
from langchain.pydantic_v1 import BaseModel
from langchain.pydantic_v1 import Extra
from langchain.pydantic_v1 import Field

class MyToolInput(BaseModel):
    query: str = Field(description="The query")

    class Config:
        extra = Extra.allow
```
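Why this works, shown as a toy stand-in for what `BaseTool._parse_input` effectively does (this is not LangChain's actual code): with a strict schema, undeclared keys are dropped during validation; with `extra = Extra.allow` they pass through to the tool.

```python
def parse_input(raw: dict, declared: set, allow_extra: bool) -> dict:
    """Toy schema validation: keep declared keys, and keep undeclared
    keys only when the schema allows extras."""
    if allow_extra:
        return dict(raw)
    return {key: value for key, value in raw.items() if key in declared}

args = {"query": "hello", "assistant_id": "assistant-42"}
strict = parse_input(args, {"query"}, allow_extra=False)   # assistant_id dropped
lenient = parse_input(args, {"query"}, allow_extra=True)   # assistant_id kept
```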
-
How do you pass `assistant_id` across all tools in LangGraph?
-
This thread is a bit older but still pops up when people search for this topic in search engines. I want to add my two cents, because the built-in solution of LangChain is not mentioned here: `InjectedToolArg`. This is how it works:

```python
from typing import Annotated

from langchain_core.tools import tool, InjectedToolArg
from pydantic import BaseModel, Field

class CalculatorInput(BaseModel):
    expression: str = Field(description="The math expression to calculate")
    auth_token: Annotated[str, InjectedToolArg]  # this will not be included in the tool-call schema

@tool(args_schema=CalculatorInput)
def calculator(expression: str, auth_token: Annotated[str, InjectedToolArg]):
    """Use this tool to calculate an expression (e.g. '1+1')."""
    ...
```

In the background, LangChain will keep the argument in the input schema:

```python
>>> calculator.get_input_schema().schema()
{
    'properties': {
        'expression': {
            'description': 'The math expression to calculate',
            'title': 'Expression',
            'type': 'string'
        },
        'auth_token': {'title': 'Auth Token', 'type': 'string'}
    },
    'required': ['expression', 'auth_token'],
    'title': 'CalculatorInput',
    'type': 'object'
}
```

but hide it from the LLM in the tool-call schema:

```python
>>> calculator.tool_call_schema.schema()
{
    'description': "Use this tool to calculate an expression (e.g. '1+1').",
    'properties': {
        'expression': {
            'description': 'The math expression to calculate',
            'title': 'Expression',
            'type': 'string'
        }
    },
    'required': ['expression'],
    'title': 'calculator',
    'type': 'object'
}
```

As a result, the additional argument `auth_token` is never generated by the model; you supply it yourself when the tool is invoked.

I hope this helps anyone looking for a solution post August 2024.
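The filtering itself can be mimicked with nothing but the standard library (a toy stand-in, not LangChain's implementation): parameters whose `Annotated` metadata carries a marker class are dropped from the schema shown to the model, yet remain part of the function's real signature.

```python
import typing
from typing import Annotated

class InjectedToolArg:
    """Stand-in marker class, mirroring the role of LangChain's."""

def visible_params(fn) -> list:
    """Return the parameter names a model would see: everything except
    parameters whose Annotated metadata contains the marker."""
    hints = typing.get_type_hints(fn, include_extras=True)
    names = []
    for name, hint in hints.items():
        if name == "return":
            continue
        metadata = getattr(hint, "__metadata__", ())
        if any(m is InjectedToolArg for m in metadata):
            continue  # hidden from the model, supplied at invocation time
        names.append(name)
    return names

def calculator(expression: str, auth_token: Annotated[str, InjectedToolArg]) -> str:
    return expression

params = visible_params(calculator)  # only ["expression"] is exposed
```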
-
Checked other resources
Commit to Help
Example Code
Description
I followed the examples from LangGraph and in the docs about "tools", but I can't figure out how to let some input parameters of the tool be set by hand while the others are filled in by the LLM.

I have an `assistant_id` parameter that determines the Pinecone namespace, and I want to pass it manually to the tool when I invoke it. If I include it in the input model, then the LLM will see the `assistant_id` argument and try to generate it. However, if I don't include it in the input model, the method `langchain_core.tools.BaseTool._parse_input` discards any parameter that I pass to the tool. How should I do it? This is how I invoke the tool:

System Info
System Information
Package Information
Packages not installed (Not Necessarily a Problem)
The following packages were not found: