How to pass the stream_options parameter to ChatOpenAI in langchain 0.0.352 so the model outputs token usage information #28074
@XiaozanZhang Hello there! I'm here to help you with any bugs, questions, or contribution assistance you might need while waiting for a human maintainer. Let's work together to tackle this!

To use the `stream_options` parameter, set `include_usage` to `True` and pass it along when streaming from your chat model:

```python
from langchain.chat_models import ChatOpenAI

# Initialize your chat model
chat_model = ChatOpenAI(...)

# Set stream_options to include token usage
stream_options = {"include_usage": True}

# Use the model with stream_options
for chunk in chat_model.stream("Hello!", stream_options=stream_options):
    # Process each chunk
    print(chunk)
```

This setup ensures that the streaming output includes token usage information by setting `include_usage` to `True` in `stream_options`.
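As a point of comparison, here is a minimal sketch of how token usage is typically surfaced when streaming with the newer `langchain-openai` package. This assumes you can upgrade beyond langchain 0.0.352: the `stream_usage` parameter, the `usage_metadata` attribute, and the `gpt-4o-mini` model name below are not part of the 0.0.352 API and are used only for illustration.

```python
from langchain_openai import ChatOpenAI

# stream_usage=True asks the API to send a final chunk that carries token counts,
# which LangChain exposes as usage_metadata on that chunk.
llm = ChatOpenAI(model="gpt-4o-mini", stream_usage=True)

usage = None
for chunk in llm.stream("Write a haiku about streaming tokens."):
    # Content chunks print as they arrive; the final chunk carries usage_metadata.
    print(chunk.content, end="", flush=True)
    if chunk.usage_metadata:
        usage = chunk.usage_metadata

print("\nToken usage:", usage)
# e.g. {'input_tokens': ..., 'output_tokens': ..., 'total_tokens': ...}
```

As far as I can tell, the `include_usage` stream option was added to the OpenAI API well after langchain 0.0.352 was released, so upgrading to `langchain-openai` is generally the more reliable route to per-stream token usage.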