
community: Add tool_calls support for ChatTongyi Model, allowing it to be used as the LLM of the lastest tool calling agent #21366

Closed

Conversation

liushuaikobe
Contributor

Example usage:

```python
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_community.chat_models.tongyi import ChatTongyi
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_core.prompts import ChatPromptTemplate

llm = ChatTongyi(model_name="qwen-max")
tools = [TavilySearchResults(max_results=1)]
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant. Make sure to use the tavily_search_results_json tool for information.",
        ),
        ("placeholder", "{chat_history}"),
        ("human", "{input}"),
        ("placeholder", "{agent_scratchpad}"),
    ]
)

# Construct the tool-calling agent
agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "what is LangChain?"})
```

or simply:

```python
from langchain_community.chat_models.tongyi import ChatTongyi
from langchain_core.pydantic_v1 import BaseModel, Field


class GetWeather(BaseModel):
    """Get the current weather in a given location"""

    location: str = Field(
        ...,
        description="The city and state, e.g. San Francisco, CA",
    )


llm = ChatTongyi(model_name="qwen-max")
llm_with_tools = llm.bind_tools([GetWeather])
llm_with_tools.invoke("what is the weather like in Hangzhou, China")
```
- Dependencies: no new dependencies required.
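When the model decides to call the bound tool, the returned `AIMessage` carries structured calls in its `tool_calls` attribute. The snippet below is a dependency-free sketch of the expected shape only; the values (including the `id`) are illustrative, not actual model output:

```python
# Illustrative shape of AIMessage.tool_calls after the invoke above.
# Real values depend on the model's response at runtime.
expected_tool_calls = [
    {
        "name": "GetWeather",
        "args": {"location": "Hangzhou, China"},
        "id": "call_abc123",  # provider-assigned id (illustrative)
    }
]

# Each entry's "args" is already parsed from the JSON the model emitted,
# so downstream code can consume it as a plain dict.
assert isinstance(expected_tool_calls[0]["args"], dict)
print(expected_tool_calls[0]["name"])  # GetWeather
```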

@dosubot dosubot bot added the size:L This PR changes 100-499 lines, ignoring generated files. label May 7, 2024

@dosubot dosubot bot added the 🤖:improvement Medium size change to existing code to handle new use-cases label May 7, 2024
@ccurme
Collaborator

ccurme commented May 7, 2024

Thanks for this @liushuaikobe. This appears to duplicate #20725, so I'm closing it for now. Would appreciate your review and/or testing on that PR if you can.

@ccurme ccurme closed this May 7, 2024
@liushuaikobe
Contributor Author

@ccurme Thank you for your review and I'm also grateful for #20725!

Actually, this PR employs a more "langchain" approach and aligns better with LangChain's original design.

  1. It uses tool_call_chunks in AIMessageChunk for handling the streaming output of AI messages with tool calls, and defines an inner subclass of AIMessageChunk to work around the limitations of the Tongyi LLM API.
  2. The utility method message_chunk_to_message from the core library is used when converting a message or message chunk into a dictionary.
  3. It explicitly passes the incremental_output parameter when calling the bind method for tool binding, clearly documenting the requirements of the Tongyi LLM API. I have also added comments for clearer explanation and better code understanding.
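The streaming behavior in point 1 can be illustrated with a dependency-free sketch. The `merge_tool_call_chunks` helper and the fragment format below are simplified stand-ins for what `AIMessageChunk`'s `tool_call_chunks` machinery does in `langchain_core`, not the actual implementation:

```python
import json


def merge_tool_call_chunks(chunks):
    """Concatenate partial tool-call fragments into one call per index."""
    merged = {}
    for chunk in chunks:
        idx = chunk["index"]
        entry = merged.setdefault(idx, {"name": "", "args": ""})
        if chunk.get("name"):
            entry["name"] += chunk["name"]
        if chunk.get("args"):
            entry["args"] += chunk["args"]
    # Parse the accumulated JSON argument strings once streaming is done.
    return [
        {"name": e["name"], "args": json.loads(e["args"])}
        for e in merged.values()
    ]


# Fragments as they might arrive from a streaming API:
stream = [
    {"index": 0, "name": "GetWeather", "args": '{"loc'},
    {"index": 0, "name": None, "args": 'ation": "Hang'},
    {"index": 0, "name": None, "args": 'zhou"}'},
]
calls = merge_tool_call_chunks(stream)
print(calls)  # [{'name': 'GetWeather', 'args': {'location': 'Hangzhou'}}]
```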

Hence, this PR entails minimal code changes while fully supporting tool calling agents with the Tongyi LLM.

I would greatly appreciate it if this PR could be reopened and reviewed once again.

@ccurme
Collaborator

ccurme commented May 16, 2024

@liushuaikobe thanks for the work you did here. We merged #20725. Would appreciate your improvements to that implementation.

@jlcoo

jlcoo commented Jun 29, 2024

@liushuaikobe @ccurme
[screenshot omitted]
Can't ChatTongyi handle the tool response?
