feat: log wrap_openai runs with unified usage_metadata #1071
Conversation
These test files will be synced over to the langsmith repo as test data.
python/langsmith/wrappers/_openai.py
Outdated
@@ -160,12 +167,59 @@ def _reduce_completions(all_chunks: List[Completion]) -> dict:
    return d


def _create_usage_metadata(oai_token_usage: dict) -> UsageMetadata:
    input_tokens = oai_token_usage.get("prompt_tokens", 0)
    output_tokens = oai_token_usage.get("completion_tokens", 0)
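The hunk is truncated above. For reference, here is a minimal sketch of how such a helper could map OpenAI's usage payload onto a unified UsageMetadata shape. It assumes the UsageMetadata, InputTokenDetails, and OutputTokenDetails TypedDicts this PR adds to langsmith.schemas, plus OpenAI's prompt_tokens_details / completion_tokens_details breakdowns; the actual code in the PR may differ:

```python
from langsmith.schemas import InputTokenDetails, OutputTokenDetails, UsageMetadata


def _create_usage_metadata(oai_token_usage: dict) -> UsageMetadata:
    # Core counts: OpenAI reports prompt/completion tokens; default to 0 if missing.
    input_tokens = oai_token_usage.get("prompt_tokens", 0)
    output_tokens = oai_token_usage.get("completion_tokens", 0)
    total_tokens = oai_token_usage.get("total_tokens", input_tokens + output_tokens)

    # Optional breakdowns (cached prompt tokens, audio, reasoning tokens).
    prompt_details = oai_token_usage.get("prompt_tokens_details") or {}
    completion_details = oai_token_usage.get("completion_tokens_details") or {}
    input_details = {
        "audio": prompt_details.get("audio_tokens"),
        "cache_read": prompt_details.get("cached_tokens"),
    }
    output_details = {
        "audio": completion_details.get("audio_tokens"),
        "reasoning": completion_details.get("reasoning_tokens"),
    }

    return UsageMetadata(
        input_tokens=input_tokens,
        output_tokens=output_tokens,
        total_tokens=total_tokens,
        # Drop breakdown keys the provider did not report.
        input_token_details=InputTokenDetails(
            **{k: v for k, v in input_details.items() if v is not None}
        ),
        output_token_details=OutputTokenDetails(
            **{k: v for k, v in output_details.items() if v is not None}
        ),
    )
```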
would you ever get "None" as a value here? Maybe better to do "or 0" to avoid a None + None scenario?
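A small illustration of the scenario being raised, with a hypothetical payload:

```python
# Hypothetical payload where a usage field is present but explicitly None.
usage = {"prompt_tokens": None, "completion_tokens": None}

# dict.get's default only applies when the key is missing, so the explicit
# None slips through and summing would raise TypeError (None + None).
input_tokens = usage.get("prompt_tokens", 0)         # -> None
output_tokens = usage.get("completion_tokens", 0)    # -> None

# "or 0" also coerces an explicit None (or other falsy value) to 0.
input_tokens = usage.get("prompt_tokens") or 0       # -> 0
output_tokens = usage.get("completion_tokens") or 0  # -> 0
total_tokens = input_tokens + output_tokens          # -> 0
```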
@@ -891,3 +891,64 @@ class PromptSortField(str, Enum):
    """Last updated time."""
    num_likes = "num_likes"
    """Number of likes."""


class InputTokenDetails(TypedDict, total=False):
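The hunk cuts off at the class header. As a rough sketch of what these unified usage types typically look like (field names borrowed from langchain-core's usage-metadata TypedDicts, which the comment below alludes to; the exact fields added here may differ):

```python
from typing_extensions import NotRequired, TypedDict


class InputTokenDetails(TypedDict, total=False):
    """Breakdown of input (prompt) tokens; all keys optional."""

    audio: int
    cache_creation: int
    cache_read: int


class OutputTokenDetails(TypedDict, total=False):
    """Breakdown of output (completion) tokens; all keys optional."""

    audio: int
    reasoning: int


class UsageMetadata(TypedDict):
    """Unified token counts attached to a run."""

    input_tokens: int
    output_tokens: int
    total_tokens: int
    input_token_details: NotRequired[InputTokenDetails]
    output_token_details: NotRequired[OutputTokenDetails]
```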
unrelated to this pr — wonder if langchain-core should import these from langsmith sdk now
No description provided.
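Since the PR carries no description, a brief usage sketch for context: wrap_openai patches an OpenAI client so its calls are traced as LangSmith runs, and this change normalizes the provider's token counts into a single usage_metadata shape on those runs (the model name below is illustrative):

```python
from openai import OpenAI

from langsmith.wrappers import wrap_openai

# Wrap the client so chat completions are logged as LangSmith runs.
client = wrap_openai(OpenAI())

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello"}],
)
# With this change, the traced run should carry unified usage_metadata
# (input_tokens / output_tokens / total_tokens) derived from response.usage.
```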