
Parallel tool/function calling is not supported #187

Closed
Puzer opened this issue Nov 19, 2024 · 1 comment
Puzer commented Nov 19, 2024

The provided code snippet works fine if the LLM (claude-3-5-sonnet) decides to call only one function, but fails if the LLM calls two functions.
The same snippet also works fine with the gpt-4o model for parallel function calls.

import json
from langchain_core.tools import tool
from langchain_core.utils.function_calling import convert_to_openai_tool
from openai import AzureOpenAI

query = "What is 3 * 12 and 6 * 2?"  # parallel function-calling case <- fails
# query = "What is 3 * 12?"  # single function call <- works fine

@tool
def add(a: int, b: int) -> int:
    """Add two integers.

    Args:
        a: First integer
        b: Second integer
    """
    return a + b

@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers.

    Args:
        a: First integer
        b: Second integer
    """
    return a * b

tools = [add, multiply]
available_functions = {t.name:t.func for t in tools}
tools_openai_schema = [convert_to_openai_tool(t) for t in tools]

messages = [{"role": "user", "content": query}]


client = AzureOpenAI(
    api_key="XXX",  
    api_version="2023-03-15-preview",
    azure_endpoint="XXX"
)

model_name = "anthropic.claude-v3-5-sonnet"
completion = client.chat.completions.create(
    model=model_name,
    messages=messages,
    tools=tools_openai_schema
)

response_message = completion.choices[0].message
tool_calls = response_message.tool_calls
assert tool_calls

response_message.content = None  # an empty string here raises an error, so convert it to None
messages.append(response_message.to_dict())


# Parallel function calling
# https://platform.openai.com/docs/guides/function-calling/configuring-parallel-function-calling

for tool_call in tool_calls:
    function_name = tool_call.function.name
    function_to_call = available_functions[function_name]
    function_args = json.loads(tool_call.function.arguments)
    function_response = function_to_call(**function_args)

    message_to_append = {
            "tool_call_id": tool_call.id,
            "role": "tool",
            "name": function_name,
            "content": str(function_response),
        }
    messages.append(message_to_append)  # extend conversation with function response, as in docs

completion = client.chat.completions.create(
    model=model_name,
    messages=messages,
    tools=tools_openai_schema
)

print(completion.choices[0].message.content)

I'm getting error:

Error code: 400 - {'error': {'message': 'Error code: 400 - {\'message\': "messages.2: the following `tool_use` ids were not found in `tool_result` blocks: {\'...\'}"}', 'type': 'invalid_request_error', 'code': '400'}}
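The error indicates that the Anthropic backend requires every `tool_use` id emitted in the assistant turn to be answered by a matching `tool_result` (a `role: "tool"` message with that `tool_call_id`). As a minimal sketch of that consistency rule (the helper name is mine, not part of the adapter or either API), a check like this over the OpenAI-style `messages` list finds the ids the backend would complain about:

```python
def find_unanswered_tool_calls(messages):
    """Return the set of tool_call ids that have no matching tool-result message."""
    called = set()
    answered = set()
    for m in messages:
        # Assistant turns carry the requested tool calls.
        for tc in m.get("tool_calls") or []:
            called.add(tc["id"])
        # Tool turns answer a specific tool_call_id.
        if m.get("role") == "tool":
            answered.add(m.get("tool_call_id"))
    return called - answered
```

If this returns a non-empty set right before the second `chat.completions.create` call, the request would fail with the 400 above.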

Puzer commented Nov 19, 2024

It seems I tested an outdated version of the adapter; the issue was resolved some time ago.

If Sonnet V2 does not call functions in parallel and instead uses them sequentially, one by one, but you would like them called in parallel (in a single LLM call), just add something like "Use function calls in parallel where possible" to the system message.
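As a sketch of that workaround (the helper name and exact hint wording are illustrative, not from the adapter), prepending a system message to the `messages` list from the snippet above is enough:

```python
def with_parallel_hint(messages,
                       hint="Use function calls in parallel where possible."):
    """Prepend a system message nudging the model toward parallel tool calls.

    Leaves the list untouched if a system message is already present.
    """
    if messages and messages[0].get("role") == "system":
        return messages
    return [{"role": "system", "content": hint}] + list(messages)
```

The resulting list can be passed to `client.chat.completions.create` unchanged in place of `messages`.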

@Puzer Puzer closed this as completed Nov 19, 2024
@github-project-automation github-project-automation bot moved this to Done in AI DIAL Nov 19, 2024