I have an AI Agent with an OpenAI LLM using LCEL. I'm trying to replace the LLM with Gemini Pro, but both ChatVertexAI and ChatGoogleGenerativeAI are unable to convert a custom tool of mine that previously worked fine with OpenAI Chat. Apparently it errors when it tries to get the parameters from the schema, since one of the input fields is an ENUM type.
The error that I get is:
```
.venv\Lib\site-packages\langchain_google_genai\_function_utils.py", line 74, in <dictcomp>
    "type_": TYPE_ENUM[v["type"]],
             ~^^^^^^^^
KeyError: 'type'
```
Note that I'm assuming the problem is this: when `_convert_tool_to_genai_function` runs and loops through the schema properties to define the parameters for the `FunctionDeclaration`, it tries to access the `"type"` key, which the ENUM field doesn't expose directly; instead the field has an `"allOf"` key that contains those values.
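To make the diagnosis concrete, here's a minimal sketch (not the actual `langchain-google-genai` code) using a hand-written schema dict shaped like what Pydantic v1 emits for an enum field, i.e. an `"allOf"` with a `"$ref"` into `"definitions"` instead of a direct `"type"` key. The helper `resolve_property_type` is hypothetical; it just shows how dereferencing `"allOf"` recovers the type that `TYPE_ENUM` needs:

```python
# Mapping from JSON-schema types to genai types, abridged for illustration.
TYPE_ENUM = {"string": "STRING", "integer": "INTEGER", "number": "NUMBER"}

# Hand-written schema shaped like Pydantic v1's .schema() output for Log:
# the enum field has no "type" key, only "allOf" -> "$ref" -> "definitions".
schema = {
    "properties": {
        "name": {"type": "string", "description": "Name of the person"},
        "status": {
            "description": "Status of the log for the person",
            "allOf": [{"$ref": "#/definitions/StatusEnum"}],
        },
    },
    "definitions": {
        "StatusEnum": {
            "type": "string",
            "enum": ["Present", "Absent", "Holiday", "Temporary person"],
        },
    },
}

def resolve_property_type(prop: dict, definitions: dict) -> str:
    """Return the JSON-schema type, following "allOf"/"$ref" if needed."""
    if "type" in prop:
        return prop["type"]
    for sub in prop.get("allOf", []):
        ref = sub.get("$ref", "")
        target = definitions.get(ref.rsplit("/", 1)[-1], {})
        if "type" in target:
            return target["type"]
    raise KeyError("type")

# The naive lookup fails for the enum field, as in the traceback above...
try:
    TYPE_ENUM[schema["properties"]["status"]["type"]]
except KeyError:
    pass  # KeyError: 'type'

# ...while dereferencing "allOf" recovers the underlying type.
status_type = resolve_property_type(schema["properties"]["status"], schema["definitions"])
print(TYPE_ENUM[status_type])
```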
Here's my code.
First I initialize the LLM for my Agent and bind the tools to it. My custom tool looks kinda like this:
```python
class LogTool(BaseTool):
    "Tool for generating a log"
    name = "generate_log"
    description = """
    Useful when you want to generate a log.
    P = Present,
    A = Person was absent,
    H = Holiday,
    TN = You do not have your Temporary person for the day.
    """
    args_schema: Type[BaseModel] = Log

    @staticmethod
    def generate_log(name: str, status: str):
        "Create log data"
        attendance_log = {
            "name": name,
            "date": datetime.now().strftime("%m/%d/%Y"),
            "status": status,
        }
        return [attendance_log]

    def _run(self, name: str, status: str):
        response = self.generate_log(name, status)
        return response
```
The BaseModel for the input schema looks like this:
```python
class Log(BaseModel):
    "Input data for generating a log"
    name: str = Field(description="Name of the person")
    status: StatusEnum = Field(description="Status of the log for the person")
```
And the ENUM:
```python
class StatusEnum(str, Enum):
    "Enum for status of the log: P, A, H, TN"
    P = "Present"
    A = "Absent"
    H = "Holiday"
    TN = "Temporary person"
```
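Until the converter handles enum fields, one possible workaround (my own suggestion, not from the thread) is to declare `status` as a plain `str` and move the allowed values into the field description, so the generated schema carries a direct `"type": "string"` key that `TYPE_ENUM` can look up:

```python
from pydantic import BaseModel, Field

class LogWorkaround(BaseModel):
    "Input data for generating a log, with the enum flattened to a plain str"
    name: str = Field(description="Name of the person")
    status: str = Field(
        description="Status of the log for the person. "
                    "One of: Present, Absent, Holiday, Temporary person"
    )

log = LogWorkaround(name="Alice", status="Present")
print(log.status)
```

The trade-off is that Pydantic no longer validates the value against the enum, so `_run` (or whatever consumes the model's output) would need to check the value itself.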
Closing this for now; it should be fixed with 0.0.11. Please feel free to re-open the issue if the problem is still there, and share a reproducible snippet!