
Commit

add qianfanendpoint for langchain_intro.qmd
wangwei1237 committed Dec 6, 2023
1 parent e15fb43 commit 9ebdda9
Showing 5 changed files with 82 additions and 1 deletion.
3 changes: 3 additions & 0 deletions _quarto.yml
@@ -41,6 +41,9 @@ book:
        - langchain_openai_assistant.qmd
        - langchain_agent_fc.qmd
        - langchain_agent_pae.qmd
    - part: "Semantic Kernel"
      chapters:
        - sematickernel_intro.qmd
    - part: "其他框架"
      chapters:
        - autogen.qmd
20 changes: 20 additions & 0 deletions code/test_qianfanendpoint.py
@@ -0,0 +1,20 @@
"""
@discribe: demo for the QianfanChatEndpoint.
@author: [email protected]
"""

from langchain.chat_models import QianfanChatEndpoint
from langchain.chains import LLMChain
from langchain.prompts import ChatPromptTemplate

system = "你是一个能力很强的机器人,你的名字叫 小叮当。"
prompt = ChatPromptTemplate.from_messages(
[
('system', system),
("human", "{query}"),
]
)
llm = QianfanChatEndpoint(model="ERNIE-Bot-4")
chain = LLMChain(llm=llm, prompt=prompt, verbose=True)
res = chain.run(query="你是谁?")
print(res)
19 changes: 19 additions & 0 deletions code/test_wx_qianfan.py
@@ -0,0 +1,19 @@
"""
@discribe: demo for the ErnieBotChat.
@author: [email protected]
"""

from langchain.chains import LLMChain
from langchain.chat_models import ErnieBotChat
from langchain.prompts import ChatPromptTemplate

system = "你是一个能力很强的机器人,你的名字叫 小叮当。"
prompt = ChatPromptTemplate.from_messages(
[
("human", "{query}"),
]
)
llm = ErnieBotChat(model_name="ERNIE-Bot-4", system=system)
chain = LLMChain(llm=llm, prompt=prompt, verbose=True)
res = chain.run(query="你是谁?")
print(res)
40 changes: 39 additions & 1 deletion langchain_intro.qmd
@@ -1,3 +1,9 @@
---
filters:
- include-code-files
code-annotations: below
---

# Introduction to LangChain

::: {.callout-tip title="Key Points"}
@@ -196,13 +202,45 @@ print(res)
# additional_kwargs={} example=False
```

#### ERNIE 4.0 (文心 4.0) {.unnumbered}
#### ERNIE 4.0 (文心 4.0)
To use the ERNIE 4.0 (文心 4.0) model in LangChain, set the `model_name` parameter to `ERNIE-Bot-4` when initializing the LLM:

```python
llm = ErnieBotChat(model_name="ERNIE-Bot-4")
```

#### Baidu Qianfan (百度千帆)
According to the [ErnieBotChat documentation](https://python.langchain.com/docs/integrations/chat/ernie) on the LangChain website, `ErnieBotChat` is no longer recommended for calling the ERNIE models; the [Baidu Qianfan `QianfanChatEndpoint`](https://python.langchain.com/docs/integrations/chat/baidu_qianfan_endpoint) is recommended instead.

The recommendation to use Baidu Qianfan (`QianfanChatEndpoint`) rests mainly on the following factors:

* `QianfanChatEndpoint` supports more of the LLMs hosted on the Qianfan platform
* `QianfanChatEndpoint` supports streaming
* `QianfanChatEndpoint` supports function calling

That said, apart from streaming, `ErnieBotChat` now offers the other two advantages as well, and it has the additional benefit of not requiring the extra `qianfan` package. Choose between the two according to your specific needs.
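
As a quick, hedged illustration of the streaming advantage (a minimal sketch, not part of this commit; it assumes the Qianfan credentials are already configured):

```python
# Illustrative streaming sketch for QianfanChatEndpoint (not from this commit).
from langchain.chat_models import QianfanChatEndpoint
from langchain.schema import HumanMessage

llm = QianfanChatEndpoint(model="ERNIE-Bot-4", streaming=True)
for chunk in llm.stream([HumanMessage(content="你是谁?")]):
    # Each chunk is a partial message; print its content as it arrives.
    print(chunk.content, end="", flush=True)
```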

When using `QianfanChatEndpoint`, replace `ernie_client_id` with `qianfan_ak` and `ernie_client_secret` with `qianfan_sk`:

```bash
export ERNIE_CLIENT_ID="……"
export ERNIE_CLIENT_SECRET="……"
export QIANFAN_AK="${ERNIE_CLIENT_ID}"
export QIANFAN_SK="${ERNIE_CLIENT_SECRET}"
```
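
As an alternative to environment variables, the same credentials can also be passed directly to the constructor. This is a hedged sketch with placeholder values, not part of this commit:

```python
# Illustrative only: pass the Qianfan credentials as constructor arguments
# instead of reading them from QIANFAN_AK / QIANFAN_SK.
from langchain.chat_models import QianfanChatEndpoint

llm = QianfanChatEndpoint(
    model="ERNIE-Bot-4",
    qianfan_ak="your-qianfan-ak",  # placeholder
    qianfan_sk="your-qianfan-sk",  # placeholder
)
```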

::: {.panel-tabset group="ernie_and_qianfan"}
## QianfanChatEndpoint

```{#lst-lc_intro_qianfan .python include="./code/test_qianfanendpoint.py" code-line-numbers="true" lst-cap="使用 QianfanChatEndpoint 调用文心大模型"}
```

## ErnieBotChat
```{#lst-lc_intro_wx_qianfan .python include="./code/test_wx_qianfan.py" code-line-numbers="true" lst-cap="使用 ErnieBotChat 调用文心大模型"}
```

:::

### Output Parsers
Large language models generally return plain text as the response; more advanced models (ERNIE, for example) can also return images or video. In many cases, though, we want structured information back rather than just a string of text.
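
As a hedged illustration of what an output parser does (a minimal sketch, not part of this commit), `CommaSeparatedListOutputParser` turns a raw text reply into a Python list:

```python
# Illustrative only: parse a comma-separated model reply into a list.
from langchain.output_parsers import CommaSeparatedListOutputParser

parser = CommaSeparatedListOutputParser()
print(parser.parse("red, green, blue"))  # -> ['red', 'green', 'blue']
```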

1 change: 1 addition & 0 deletions sematickernel_intro.qmd
@@ -0,0 +1 @@
# Introduction to Semantic Kernel
