community: Add logprobs in gen output (#14826)
Now that it's supported again for OpenAI chat models.

Shame this wouldn't include it in the `.invoke()` output, though (it's not included in the message itself). A follow-up would be needed for that to be the case.
hinthornw authored Dec 18, 2023
1 parent c316731 commit 2d91d2b
Showing 1 changed file with 4 additions and 1 deletion.
5 changes: 4 additions & 1 deletion libs/community/langchain_community/chat_models/openai.py
```diff
@@ -454,9 +454,12 @@ def _create_chat_result(self, response: Union[dict, BaseModel]) -> ChatResult:
             response = response.dict()
         for res in response["choices"]:
             message = convert_dict_to_message(res["message"])
+            generation_info = dict(finish_reason=res.get("finish_reason"))
+            if "logprobs" in res:
+                generation_info["logprobs"] = res["logprobs"]
             gen = ChatGeneration(
                 message=message,
-                generation_info=dict(finish_reason=res.get("finish_reason")),
+                generation_info=generation_info,
             )
             generations.append(gen)
         token_usage = response.get("usage", {})
```
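The effect of the patch is that each choice's `logprobs` field, when present in the API response, is copied into that generation's `generation_info`. Below is a minimal, self-contained sketch of that loop run against a stubbed OpenAI-style response dict (no network call; `build_generation_infos` and the stub payload are hypothetical illustrations, not LangChain APIs — only the key names mirror the diff):

```python
# Hypothetical re-implementation of the patched loop from _create_chat_result,
# reduced to the generation_info handling so it can run against a stub.
def build_generation_infos(response: dict) -> list[dict]:
    infos = []
    for res in response["choices"]:
        generation_info = dict(finish_reason=res.get("finish_reason"))
        if "logprobs" in res:  # only present when logprobs were requested
            generation_info["logprobs"] = res["logprobs"]
        infos.append(generation_info)
    return infos

# Stubbed response resembling a chat completions payload with two choices,
# one with logprobs and one without.
fake_response = {
    "choices": [
        {
            "message": {"role": "assistant", "content": "Hi"},
            "finish_reason": "stop",
            "logprobs": {"content": [{"token": "Hi", "logprob": -0.01}]},
        },
        {
            "message": {"role": "assistant", "content": "Hello"},
            "finish_reason": "stop",
            # no "logprobs" key: the field is simply omitted
        },
    ]
}

infos = build_generation_infos(fake_response)
print(infos[0]["logprobs"])    # {'content': [{'token': 'Hi', 'logprob': -0.01}]}
print("logprobs" in infos[1])  # False
```

Note the conditional: choices without a `logprobs` key leave `generation_info` untouched, so existing callers that never request logprobs see no change.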
