Commit

community: Add token_usage and model_name metadata to ChatZhipuAI stream() and astream() response (#27677)

Thank you for contributing to LangChain!


- **Description:** Add token_usage and model_name metadata to
ChatZhipuAI stream() and astream() response
- **Issue:** None
- **Dependencies:** None
- **Twitter handle:** None
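With this change, the final streamed chunk (the one carrying a `finish_reason`) also exposes `token_usage` and `model_name` in its `generation_info`. A minimal sketch of how a caller might pick the metadata out of a stream; the chunk dicts below are hypothetical stand-ins for streamed generations, not an actual API call:

```python
# Hypothetical sketch: reading token_usage/model_name from streamed chunks.
# The dicts stand in for ChatGenerationChunk objects; all values are made up.

def collect_stream_metadata(chunks):
    """Return the token_usage and model_name reported on the final chunk, if any."""
    usage, model = None, None
    for chunk in chunks:
        info = chunk.get("generation_info")
        if info is not None:  # only the finishing chunk carries generation_info
            usage = info.get("token_usage")
            model = info.get("model_name")
    return usage, model

stream = [
    {"text": "Hello", "generation_info": None},
    {
        "text": " world",
        "generation_info": {
            "finish_reason": "stop",
            "token_usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7},
            "model_name": "glm-4",
        },
    },
]
usage, model = collect_stream_metadata(stream)
```

Intermediate chunks report `generation_info` as `None`, so callers should tolerate missing metadata until the stream finishes.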


- [ ] **Add tests and docs**: If you're adding a new integration, please
include
1. a test for the integration, preferably unit tests that do not rely on
network access,
2. an example notebook showing its use. It lives in the
`docs/docs/integrations` directory.


- [ ] **Lint and test**: Run `make format`, `make lint` and `make test`
from the root of the package(s) you've modified. See contribution
guidelines for more: https://python.langchain.com/docs/contributing/

Additional guidelines:
- Make sure optional dependencies are imported within a function.
- Please do not add dependencies to pyproject.toml files (even optional
ones) unless they are required for unit tests.
- Most PRs should not touch more than one package.
- Changes should be backwards compatible.
- If you are adding something to community, do not re-import it in
langchain.

If no one reviews your PR within a few days, please @-mention one of
baskaryan, efriis, eyurtsev, ccurme, vbarda, hwchase17.

Co-authored-by: jianfehuang <[email protected]>
suifengfengye and jianfehuang authored Oct 30, 2024
1 parent 8a5807a commit 18cfb4c
Showing 1 changed file with 14 additions and 2 deletions.
16 changes: 14 additions & 2 deletions libs/community/langchain_community/chat_models/zhipuai.py
```diff
@@ -591,13 +591,19 @@ def _stream(
             if len(chunk["choices"]) == 0:
                 continue
             choice = chunk["choices"][0]
+            usage = chunk.get("usage", None)
+            model_name = chunk.get("model", "")
             chunk = _convert_delta_to_message_chunk(
                 choice["delta"], default_chunk_class
             )
             finish_reason = choice.get("finish_reason", None)

             generation_info = (
-                {"finish_reason": finish_reason}
+                {
+                    "finish_reason": finish_reason,
+                    "token_usage": usage,
+                    "model_name": model_name,
+                }
                 if finish_reason is not None
                 else None
             )
```
```diff
@@ -678,13 +684,19 @@ async def _astream(
             if len(chunk["choices"]) == 0:
                 continue
             choice = chunk["choices"][0]
+            usage = chunk.get("usage", None)
+            model_name = chunk.get("model", "")
             chunk = _convert_delta_to_message_chunk(
                 choice["delta"], default_chunk_class
             )
             finish_reason = choice.get("finish_reason", None)

             generation_info = (
-                {"finish_reason": finish_reason}
+                {
+                    "finish_reason": finish_reason,
+                    "token_usage": usage,
+                    "model_name": model_name,
+                }
                 if finish_reason is not None
                 else None
             )
```
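The logic the patch adds to both `_stream()` and `_astream()` can be isolated as a small standalone function. This is a sketch of that logic against the raw ZhipuAI SSE chunk shape the methods assume (the `mid`/`final` sample chunks below are illustrative, not captured API output):

```python
# Sketch of the merged logic: build generation_info from one raw stream chunk.
# Mirrors the patched code path in _stream()/_astream(); sample data is made up.

def build_generation_info(chunk: dict):
    """Return generation_info for a chunk, or None before the stream finishes."""
    choice = chunk["choices"][0]
    usage = chunk.get("usage", None)        # only present on the final chunk
    model_name = chunk.get("model", "")
    finish_reason = choice.get("finish_reason", None)
    return (
        {
            "finish_reason": finish_reason,
            "token_usage": usage,
            "model_name": model_name,
        }
        if finish_reason is not None
        else None
    )

# An intermediate chunk: no finish_reason, so no generation_info is attached.
mid = {
    "choices": [{"delta": {"content": "Hi"}, "finish_reason": None}],
    "model": "glm-4",
}
# The final chunk: finish_reason is set and usage is reported alongside it.
final = {
    "choices": [{"delta": {}, "finish_reason": "stop"}],
    "model": "glm-4",
    "usage": {"prompt_tokens": 3, "completion_tokens": 1, "total_tokens": 4},
}
```

Because `usage` is only populated by the API on the terminal chunk, gating on `finish_reason` keeps intermediate chunks lightweight while still surfacing the totals exactly once.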
