
Official Anthropic function calling support? #65

Closed
austinmw opened this issue May 30, 2024 · 11 comments · Fixed by #70

@austinmw

Anthropic announced GA support for tools today (5/30): https://www.anthropic.com/news/tool-use-ga

Is this meant to be supported? I saw some tool-use PRs mentioned, but the following code returns an empty list:

from langchain_aws import ChatBedrock  # langchain-aws v0.1.4 (latest)
from langchain_community.tools import DuckDuckGoSearchRun

llm = ChatBedrock(
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
)

search = DuckDuckGoSearchRun()
tools = [search]

# Bind the tools to the model
llm_with_tools = llm.bind_tools(tools)

messages = [
    ("system", "You are a helpful assistant."),
    ("human", "What is LangChain Tool Calling?"),
]

llm_with_tools.invoke(messages).tool_calls

Output:

[]
@laithalsaadoon
Contributor

@3coins what are your thoughts on adding Converse API support? Would we want a different class like ConverseBedrock maybe?

@austinmw
Author

austinmw commented May 31, 2024

Ideally the implementation would allow traditional LangChain tools to be passed and translated to toolSpec under the hood. For example, I hope something like this will work in a future implementation:

from langchain_aws import ChatBedrock
from langchain_core.messages import HumanMessage
from langchain_core.tools import tool
from langchain_community.tools import DuckDuckGoSearchRun

@tool
def calculator(a: int, b: int) -> int:
    """Perform multiplication."""
    return a * b

llm = ChatBedrock(  # or ConverseBedrock if necessary
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
)

search = DuckDuckGoSearchRun()
tools = [search, calculator]

# Bind tools to the LLM
llm_with_tools = llm.bind_tools(tools)

messages = [
    # HumanMessage(content="What is the result of 42 multiplied by 3?"),  # should use calculator
    HumanMessage(content="What is today's date?"),  # should use search
]

result = llm_with_tools.invoke(messages)
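
In the meantime, the LangChain-tool-to-toolSpec translation can be sketched with a small helper. The helper name and exact mapping here are assumptions for illustration, not the eventual langchain-aws API; the envelope follows the Converse request shape (toolSpec with name, description, and inputSchema.json):

```python
# Hypothetical helper: wrap a name/description/JSON-schema tool definition
# in the Converse API's toolSpec envelope.
def to_tool_spec(name: str, description: str, input_schema: dict) -> dict:
    return {
        "toolSpec": {
            "name": name,
            "description": description,
            "inputSchema": {"json": input_schema},
        }
    }

# Example: the calculator tool from above, expressed as a toolSpec.
calculator_spec = to_tool_spec(
    "calculator",
    "Perform multiplication.",
    {
        "type": "object",
        "properties": {
            "a": {"type": "integer"},
            "b": {"type": "integer"},
        },
        "required": ["a", "b"],
    },
)
print(calculator_spec["toolSpec"]["name"])  # calculator
```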

@thiagotps
Contributor

thiagotps commented May 31, 2024

Function calling works with the invoke_model method too, at least for Claude:

import json
import boto3

# bedrock-runtime client (region is an example)
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

modelId = "anthropic.claude-3-haiku-20240307-v1:0"
response = bedrock.invoke_model(
    body=json.dumps(
        {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 1024,
            "tools": [{
                "name": "get_weather",
                "description": "Get the current weather in a given location",
                "input_schema": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA"
                        },
                        "unit": {
                            "type": "string",
                            "enum": ["celsius", "fahrenheit"],
                            "description": "The unit of temperature, either \"celsius\" or \"fahrenheit\""
                        }
                    },
                    "required": ["location"]
                }
            }],
            "messages": [{"role": "user", "content": "What is the weather like in San Francisco?"}]
        }
    ),
    modelId=modelId,
)
json.loads(response.get("body").read())
{'id': 'msg_bdrk_01H4yz4vJzPiLE8rhwKLhnMB',
 'type': 'message',
 'role': 'assistant',
 'model': 'claude-3-haiku-20240307',
 'stop_sequence': None,
 'usage': {'input_tokens': 393, 'output_tokens': 73},
 'content': [{'type': 'tool_use',
   'id': 'toolu_bdrk_01QSbfcGUp7mtAbqixSR7YR9',
   'name': 'get_weather',
   'input': {'location': 'San Francisco, CA', 'unit': 'celsius'}}],
 'stop_reason': 'tool_use'}
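
Extracting the tool call from a response like the one above is straightforward; a minimal sketch, using the sample payload from this thread (abridged):

```python
# Sample invoke_model response body from the comment above (abridged).
response_body = {
    "content": [
        {
            "type": "tool_use",
            "id": "toolu_bdrk_01QSbfcGUp7mtAbqixSR7YR9",
            "name": "get_weather",
            "input": {"location": "San Francisco, CA", "unit": "celsius"},
        }
    ],
    "stop_reason": "tool_use",
}

# Collect the tool_use blocks the model emitted.
tool_calls = [
    block for block in response_body["content"] if block["type"] == "tool_use"
]
print(tool_calls[0]["name"], tool_calls[0]["input"])
```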

@AlinJiang

Quote from https://aws.amazon.com/about-aws/whats-new/2024/05/amazon-bedrock-new-converse-api/

> The Converse API provides a consistent experience that works with Amazon Bedrock models, removing the need for developers to manage any model-specific implementation. With this API, you can write code once and use it seamlessly with different models on Amazon Bedrock.

Since the new Converse API provides the long-awaited unified interface for all Bedrock models, along with official tool-use support, it makes sense to switch from the invoke API to the Converse API as soon as possible.

@AlinJiang

In the new implementation on top of the Converse API, we should also expose the complete response payload, including metadata like "stop_reason"; that information becomes crucial when working with function calling and agents.
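
A sketch of why that metadata matters, using an illustrative Converse-style response (the shape follows the Converse API; the values here are made up for the example):

```python
# Illustrative Converse API response (shape per the Bedrock Converse docs;
# IDs and token counts are invented for this example).
converse_response = {
    "output": {
        "message": {
            "role": "assistant",
            "content": [
                {
                    "toolUse": {
                        "toolUseId": "tooluse_example",
                        "name": "get_weather",
                        "input": {"location": "San Francisco, CA"},
                    }
                }
            ],
        }
    },
    "stopReason": "tool_use",
    "usage": {"inputTokens": 393, "outputTokens": 73},
}

# An agent loop needs stopReason to decide whether to execute a tool
# and continue, or to return the final answer.
if converse_response["stopReason"] == "tool_use":
    tool_use = converse_response["output"]["message"]["content"][0]["toolUse"]
    print(tool_use["name"], tool_use["input"])
```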

@thiagotps
Contributor

> @3coins what are your thoughts on adding Converse API support? Would we want a different class like ConverseBedrock maybe?

Given that there is an ongoing pull request implementing function calling via invoke_model, I also think it is a good idea to have a separate class for the Converse API.

@adamdbrw

I found this issue because we want to use Bedrock models in our LangChain work for robotics, so switching to the Converse API would be perfect. Right now it is more convenient to work with OpenAI.

@austinmw
Author

@baskaryan Not sure why this was closed? Does #70 use the new function calling support? I thought it was using XML parsing?

@yingzwang

#70 is still based on the legacy tool use of Claude, which is not recommended by Anthropic and does not work well with Claude 3.

@zhichenggeng

@austinmw @yingzwang Correct me if I'm wrong. This code snippet should be using the official function calling from Claude; if the model is not Claude 3, it falls back to the old XML-parsing approach.

if "claude-3" in self._get_model():
    if not tool_choice:
        pass
    elif isinstance(tool_choice, dict):
        kwargs["tool_choice"] = tool_choice
    elif isinstance(tool_choice, str) and tool_choice in ("any", "auto"):
        kwargs["tool_choice"] = {"type": tool_choice}
    elif isinstance(tool_choice, str):
        kwargs["tool_choice"] = {"type": "tool", "name": tool_choice}
    else:
        raise ValueError(
            f"Unrecognized 'tool_choice' type {tool_choice=}."
            f"Expected dict, str, or None."
        )
    return self.bind(tools=formatted_tools, **kwargs)
else:
    # add tools to the system prompt, the old way
    system_formatted_tools = get_system_message(formatted_tools)
    self.set_system_prompt_with_tools(system_formatted_tools)
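
The tool_choice branch above can be factored into a small standalone function to make the mapping visible; this is a sketch mirroring the snippet, not the library's actual code:

```python
from typing import Optional, Union

def normalize_tool_choice(tool_choice: Optional[Union[dict, str]]) -> Optional[dict]:
    """Map a user-facing tool_choice value to Anthropic's request format,
    mirroring the claude-3 branch of the snippet above. Returns None when
    no tool_choice should be sent."""
    if not tool_choice:
        return None
    if isinstance(tool_choice, dict):
        return tool_choice  # already in the request format
    if isinstance(tool_choice, str) and tool_choice in ("any", "auto"):
        return {"type": tool_choice}
    if isinstance(tool_choice, str):
        return {"type": "tool", "name": tool_choice}  # force a specific tool
    raise ValueError(
        f"Unrecognized 'tool_choice' type {tool_choice=}. Expected dict, str, or None."
    )

print(normalize_tool_choice("auto"))         # {'type': 'auto'}
print(normalize_tool_choice("get_weather"))  # {'type': 'tool', 'name': 'get_weather'}
```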
