
Unable to use custom tool with ENUM field #54

Closed
SteelValkyrie opened this issue Mar 8, 2024 · 2 comments
@SteelValkyrie

I have an AI agent built with LCEL that uses an OpenAI LLM. I'm trying to replace the LLM with Gemini Pro, but both ChatVertexAI and ChatGoogleGenerativeAI fail to convert a custom tool of mine that previously worked fine with the OpenAI chat model. Apparently it errors while extracting the parameters from the tool's schema, because one of the input fields is an Enum type.

The error that I get is:

.venv\Lib\site-packages\langchain_google_genai\_function_utils.py", line 74, in <dictcomp>
    "type_": TYPE_ENUM[v["type"]],
                       ~^^^^^^^^
KeyError: 'type'

I'm assuming the problem is that when _convert_tool_to_genai_function builds the parameters for the FunctionDeclaration by looping through the schema properties, it tries to access the "type" key, which the Enum field doesn't expose directly; instead the field has an "allOf" key that wraps those values.
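The failure mode can be reproduced with plain dicts, no pydantic required (the schema shape below is my assumption of what pydantic v1 emits for an Enum field; TYPE_ENUM mirrors the mapping in _function_utils.py):

```python
# Stdlib-only sketch of the KeyError. TYPE_ENUM mirrors the mapping used in
# langchain_google_genai/_function_utils.py; the property dicts below are an
# assumed reconstruction of what pydantic's .schema() produces for Log.
TYPE_ENUM = {
    "string": "STRING",
    "integer": "INTEGER",
    "number": "NUMBER",
    "boolean": "BOOLEAN",
    "array": "ARRAY",
    "object": "OBJECT",
}

properties = {
    "name": {"title": "Name", "description": "Name of the person", "type": "string"},
    # The Enum field has no "type" key; the type info hides behind "allOf":
    "status": {
        "description": "Status of the log for the person",
        "allOf": [{"$ref": "#/definitions/StatusEnum"}],
    },
}

try:
    # Same dict comprehension pattern as the failing library code:
    params = {k: {"type_": TYPE_ENUM[v["type"]]} for k, v in properties.items()}
except KeyError as e:
    print(f"KeyError: {e}")  # KeyError: 'type'
```

The comprehension succeeds for "name" but raises on "status", matching the traceback above.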

Here's my code.
First I initialize the LLM for my agent and bind the tools to it:

llm = ChatGoogleGenerativeAI(  # type: ignore
    model="gemini-1.0-pro-latest",
    temperature=0,
    google_api_key=GOOGLE_API_KEY,
    convert_system_message_to_human=True,
    verbose=True,
)

llm_with_tools = llm.bind(functions=tools)

My custom tool looks something like this:

from datetime import datetime
from typing import Type

from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_core.tools import BaseTool


class LogTool(BaseTool):
    "Tool for generating a log"

    name = "generate_log"
    description = """
            Useful when you want to generate a log. 
            P = Present,
            A = Person was absent,
            H = Holiday,
            TN = You do not have your Temporary person for the day.
            """

    args_schema: Type[BaseModel] = Log

    @staticmethod
    def generate_log(name: str, status: str):
        "Create log data"

        attendance_log = {
            "name": name,
            "date": datetime.now().strftime("%m/%d/%Y"),
            "status": status,
        }

        return [attendance_log]

    def _run(self, name: str, status: str):
        response = self.generate_log(name, status)
        return response

The BaseModel for the input schema looks like this:

class Log(BaseModel):
    "Input data for generating a log"

    name: str = Field(description="Name of the person")
    status: StatusEnum = Field(description="Status of the log for the person")

And the ENUM:

class StatusEnum(str, Enum):
    "Enum for status of the log: P, A, H, TN"

    P = "Present"
    A = "Absent"
    H = "Holiday"
    TN = "Temporary person"
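Until the library handles the "allOf" wrapper itself, one possible workaround is to flatten single-$ref "allOf" properties before handing the schema to the converter. This is a hypothetical helper (resolve_enum_props is not part of langchain-google-genai), and the schema dict below is an assumed reconstruction of Log.schema():

```python
def resolve_enum_props(schema: dict) -> dict:
    """Flatten single-$ref "allOf" wrappers so every property carries a
    "type" key. Hypothetical helper, not part of langchain-google-genai."""
    defs = schema.get("definitions", {})
    props = {}
    for name, prop in schema.get("properties", {}).items():
        prop = dict(prop)  # don't mutate the input
        if "allOf" in prop and len(prop["allOf"]) == 1:
            ref = prop["allOf"][0].get("$ref", "")
            target = defs.get(ref.rsplit("/", 1)[-1], {})
            prop.pop("allOf")
            # Referenced definition supplies "type" and "enum"; the field's
            # own keys (e.g. "description") take precedence.
            prop = {**target, **prop}
        props[name] = prop
    return {**schema, "properties": props}


# Assumed shape of Log.schema() under pydantic v1:
schema = {
    "title": "Log",
    "type": "object",
    "properties": {
        "name": {"title": "Name", "type": "string"},
        "status": {
            "description": "Status of the log for the person",
            "allOf": [{"$ref": "#/definitions/StatusEnum"}],
        },
    },
    "definitions": {
        "StatusEnum": {
            "title": "StatusEnum",
            "type": "string",
            "enum": ["Present", "Absent", "Holiday", "Temporary person"],
        },
    },
}

fixed = resolve_enum_props(schema)
print(fixed["properties"]["status"]["type"])  # string
```

After flattening, every property has the "type" key that _convert_tool_to_genai_function expects, so the TYPE_ENUM lookup no longer raises.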
@lkuligin
Collaborator

@SteelValkyrie could you provide a fully reproducible snippet, please?

But most probably the recent change #56 should solve your problem; at least I tested with your tool and got a valid response:

AIMessage(content='', additional_kwargs={'function_call': {'name': 'generate_log', 'arguments': '{"__arg1": "P"}'}})

@lkuligin
Collaborator

Closing this for now; it should be fixed with 0.0.11. Please feel free to re-open the issue if the problem is still there, and share a reproducible snippet!
