
community: Implement bind_tools for ChatTongyi #20725

Merged
merged 20 commits into langchain-ai:master May 16, 2024

Conversation

@cheese-git (Contributor) commented Apr 22, 2024

Description

Implement bind_tools in ChatTongyi. Usage example:

```python
from langchain_core.tools import tool
from langchain_community.chat_models.tongyi import ChatTongyi

@tool
def multiply(first_int: int, second_int: int) -> int:
    """Multiply two integers together."""
    return first_int * second_int

llm = ChatTongyi(model="qwen-turbo")
llm_with_tools = llm.bind_tools([multiply])

msg = llm_with_tools.invoke("What's 5 times forty two")
print(msg)
```

Streaming is also supported.
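Once the model returns a tool call, executing the tool locally is straightforward. A minimal sketch, assuming a parsed tool call shaped like the entries of LangChain's `msg.tool_calls` (the values here are illustrative, not an actual model response):

```python
def multiply(first_int: int, second_int: int) -> int:
    """Multiply two integers together."""
    return first_int * second_int

# A parsed tool call as it would appear in msg.tool_calls after invoke().
# Illustrative values; the real call comes from the model response.
tool_call = {"name": "multiply", "args": {"first_int": 5, "second_int": 42}}

# Dispatch by tool name and execute with the parsed arguments.
tools = {"multiply": multiply}
result = tools[tool_call["name"]](**tool_call["args"])
print(result)  # 210
```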

Dependencies

No new dependencies are required for this change.

@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. 🤖:improvement Medium size change to existing code to handle new use-cases labels Apr 22, 2024

```python
AIMessageChunk(
    content=content,
    additional_kwargs=additional_kwargs,
    tool_calls=tool_calls,
```

Collaborator


are these tool calls fully formed in a streaming context?

if not you can specify tool_call_chunks on AIMessageChunk instead, with args a (partial json) string. See example here:

```python
    tool_call_chunks=tool_call_chunks,
```
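The accumulation of partial tool-call chunks the reviewer describes can be sketched as follows. This is a simplified, self-contained illustration: the chunk shapes are hypothetical stand-ins for what a provider streams, with `args` arriving as fragments of a JSON string that only parse once concatenated.

```python
import json

# Hypothetical streamed chunks: `args` is a partial JSON string in each.
chunks = [
    {"tool_call_chunks": [{"name": "multiply", "args": '{"first_int": 5, '}]},
    {"tool_call_chunks": [{"name": None, "args": '"second_int": 42}'}]},
]

def accumulate(chunks):
    """Concatenate partial JSON arg strings, then parse once complete."""
    name = None
    args_buf = ""
    for chunk in chunks:
        for tc in chunk["tool_call_chunks"]:
            if tc["name"] is not None:
                name = tc["name"]
            args_buf += tc["args"]
    return {"name": name, "args": json.loads(args_buf)}

call = accumulate(chunks)
print(call)  # {'name': 'multiply', 'args': {'first_int': 5, 'second_int': 42}}
```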

Contributor Author


Thanks a lot, I've made some updates addressing this.

```python
return (
    AIMessage(
        content=message_chunk.content,
        tool_calls=message_chunk.additional_kwargs["tool_calls"],
```
Collaborator


this is confusing, because in convert_dict_to_message we are parsing tool calls (e.g., in the call to parse_tool_call), but here we are passing them in even though they are already parsed.

at this point, is the following true?

```python
if message_chunk.additional_kwargs["tool_calls"]:
    item = message_chunk.additional_kwargs["tool_calls"][0]

assert isinstance(item["args"], dict)  # not a string
```
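For illustration, the parsed-vs-unparsed distinction at issue here can be sketched as follows. The raw payload shape is hypothetical (an OpenAI-style function call, where `arguments` is a JSON string); the point is that after parsing, `args` is a dict.

```python
import json

# Hypothetical raw tool call as an API might return it:
# `arguments` is still a JSON string at this stage.
raw = {"function": {"name": "multiply",
                    "arguments": '{"first_int": 5, "second_int": 42}'}}

# What a parse step (such as parse_tool_call) produces: args is a dict.
parsed = {"name": raw["function"]["name"],
          "args": json.loads(raw["function"]["arguments"])}

assert isinstance(parsed["args"], dict)  # parsed, not a string
print(parsed["args"])  # {'first_int': 5, 'second_int': 42}
```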

Contributor Author


Sorry for the delayed response, I'm having a really tough time with my job 😫
This was a mistake; I've rewritten the method referring to langchain_openai's base module.

@cheese-git
Contributor Author

I also did some refactoring.

The main challenge of supporting tools for Tongyi is that Tongyi does not yet support using tools with incremental_output=True, which means every new chunk contains the content of all previous chunks. However, some internal methods of LangChain assume the opposite.

That's why I ended up writing a subtract_client_response method, to chop off the duplicated content when streaming without incremental_output=True.
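The subtraction idea can be sketched in a few lines. This is a simplified, text-only illustration of what a method like subtract_client_response must do (the real method also handles tool-call fields and the full response structure, not just text):

```python
def subtract_text(resp_text: str, prev_text: str) -> str:
    """Return only the part of the cumulative output not already
    seen in the previous chunk."""
    if resp_text.startswith(prev_text):
        return resp_text[len(prev_text):]
    return resp_text  # fall back to the full text if prefixes diverge

# Cumulative chunks as Tongyi returns them when incremental_output is off:
cumulative = ["The answer", "The answer is", "The answer is 210."]
prev = ""
deltas = []
for text in cumulative:
    deltas.append(subtract_text(text, prev))
    prev = text
print(deltas)  # ['The answer', ' is', ' 210.']
```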

@cheese-git cheese-git requested a review from ccurme April 29, 2024 15:20
```python
else:
    delta_resp = self.subtract_client_response(resp, prev_resp)
    prev_resp = resp
    yield check_response(delta_resp)
```
Collaborator


@cheese-git would this represent a breaking change for streaming generally?

Contributor Author


No, it would not.

Before this PR, incremental_output is always True when streaming:

```python
if params.get("stream"):
    params["incremental_output"] = True
```

To support tools when streaming, I updated the condition to this:

```python
# According to the Tongyi official docs,
# `incremental_output` with `tools` is not supported yet
if params.get("stream") and not params.get("tools"):
    params["incremental_output"] = True
```

So "streaming without incremental_output being True" is a new circumstance.
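Packaged as a small helper, the condition behaves like this. A minimal sketch, not the actual ChatTongyi code path (the real logic lives inside the model's parameter preparation):

```python
def prepare_params(params: dict) -> dict:
    """Enable incremental_output only when streaming without tools,
    since Tongyi does not yet support both together."""
    params = dict(params)  # avoid mutating the caller's dict
    if params.get("stream") and not params.get("tools"):
        params["incremental_output"] = True
    return params

print(prepare_params({"stream": True}))
# {'stream': True, 'incremental_output': True}
print(prepare_params({"stream": True, "tools": [{"name": "multiply"}]}))
# {'stream': True, 'tools': [{'name': 'multiply'}]}  -- left unset
```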

@cheese-git cheese-git requested a review from ccurme May 14, 2024 08:37
@ccurme ccurme merged commit 0ead09f into langchain-ai:master May 16, 2024
62 checks passed
@Arunshmily

I use ChatTongyi to build an agent.
Which output parser can I use?
ToolsAgentOutputParser() is not available for ChatTongyi.
langchain-core==0.2.1

hinthornw pushed a commit that referenced this pull request Jun 20, 2024

Co-authored-by: Bagatur <[email protected]>
Co-authored-by: Chester Curme <[email protected]>
@jlcoo

jlcoo commented Jun 29, 2024

```python
from langchain_core.tools import tool
from langchain_community.chat_models.tongyi import ChatTongyi

@tool
def multiply(first_int: int, second_int: int) -> int:
    """Multiply two integers together."""
    return first_int * second_int

llm = ChatTongyi(model="qwen-turbo")
llm_with_tools = llm.bind_tools([multiply])

msg = llm_with_tools.invoke("What's 5 times forty two")
print(msg)
```

This code can't call multiply in langchain 0.2.6?

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
🤖:improvement Medium size change to existing code to handle new use-cases size:L This PR changes 100-499 lines, ignoring generated files.
Projects
None yet
Development

Successfully merging this pull request may close these issues.

5 participants