feat: 1. Add system parameters, 2. Align with the QianfanChatEndpoint for function calling (#14275)

**Description:**

1. Add a `system` parameter to the ERNIE LLM API to set the role of the LLM.
2. Add support for the ERNIE-Bot-turbo-AI model according to the documentation: https://cloud.baidu.com/doc/WENXINWORKSHOP/s/Alp0kdm0n.
3. Align the function-calling behavior of `ErnieBotChat` with the `QianfanChatEndpoint`.

With this PR, `QianfanChatEndpoint()` can use the function-calling ability with `create_ernie_fn_chain()`. For example:

```python
import json

from langchain.prompts.chat import ChatPromptTemplate
from langchain.chat_models import QianfanChatEndpoint
from langchain.chains.ernie_functions import create_ernie_fn_chain


def get_current_news(location: str) -> str:
    """Get the current news based on the location.

    Args:
        location (str): The location to query.

    Returns:
        str: Current news based on the location.
    """
    news_info = {
        "location": location,
        "news": [
            "I have a Book.",
            "It's a nice day, today.",
        ],
    }
    return json.dumps(news_info)


def get_current_weather(location: str, unit: str = "celsius") -> str:
    """Get the current weather in a given location.

    Args:
        location (str): Location of the weather.
        unit (str): Unit of the temperature.

    Returns:
        str: Weather in the given location.
    """
    weather_info = {
        "location": location,
        "temperature": "27",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    return json.dumps(weather_info)


template = ChatPromptTemplate.from_messages([
    ("user", "{user_input}"),
])

chat = QianfanChatEndpoint(model="ERNIE-Bot-4")
chain = create_ernie_fn_chain(
    [get_current_weather, get_current_news], chat, template, verbose=True
)
res = chain.run("北京今天的新闻是什么?")  # "What is today's news in Beijing?"
print(res)
```

The result of the above code:

```
> Entering new LLMChain chain...
Prompt after formatting:
Human: 北京今天的新闻是什么?

> Finished chain.
{'name': 'get_current_news', 'arguments': {'location': '北京'}}
```

`ErnieBotChat` can now use the `system` parameter to set the role of the LLM.
```python
from langchain.prompts import ChatPromptTemplate
from langchain.chains import LLMChain
from langchain.chat_models import ErnieBotChat

llm = ErnieBotChat(
    model_name="ERNIE-Bot-turbo-AI",
    # System prompt: "You are a very capable robot named Xiao Dingdang.
    # Whatever you are asked, you can give an answer."
    system="你是一个能力很强的机器人,你的名字叫 小叮当。无论问你什么问题,你都可以给出答案。",
)
prompt = ChatPromptTemplate.from_messages(
    [
        ("human", "{query}"),
    ]
)
chain = LLMChain(llm=llm, prompt=prompt, verbose=True)
res = chain.run(query="你是谁?")  # "Who are you?"
print(res)
```

The result of the above code:

```
> Entering new LLMChain chain...
Prompt after formatting:
Human: 你是谁?

> Finished chain.
我是小叮当,一个智能机器人。我可以为你提供各种服务,包括回答问题、提供信息、进行计算等。如果你需要任何帮助,请随时告诉我,我会尽力为你提供最好的服务。
```

(The reply translates roughly as: "I am Xiao Dingdang, an intelligent robot. I can provide various services for you, including answering questions, providing information, and performing calculations. If you need any help, just let me know and I will do my best to serve you.")