
Commit

fix bug with openai token count (langchain-ai#1806)
hwchase17 authored Mar 20, 2023
1 parent b1c4480 commit f6d24d5
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion langchain/chat_models/openai.py
```diff
@@ -229,7 +229,8 @@ def _combine_llm_outputs(self, llm_outputs: List[Optional[dict]]) -> dict:
         overall_token_usage: dict = {}
         for output in llm_outputs:
             if output is None:
-                raise ValueError("Should always be something for OpenAI.")
+                # Happens in streaming
+                continue
             token_usage = output["token_usage"]
             for k, v in token_usage.items():
                 if k in overall_token_usage:
```
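
For context, here is a minimal standalone sketch of the combining logic after this fix. The hunk only shows the loop up to the `if k in overall_token_usage:` check, so the accumulation `else` branch and the final return value below are assumptions about the surrounding code rather than part of the diff:

```python
from typing import List, Optional


def combine_llm_outputs(llm_outputs: List[Optional[dict]]) -> dict:
    """Merge per-generation token_usage dicts into one overall count."""
    overall_token_usage: dict = {}
    for output in llm_outputs:
        if output is None:
            # When streaming, the OpenAI API does not report token usage,
            # so a generation's llm_output can be None; skip it instead of
            # raising (this is the behavior the commit introduces).
            continue
        token_usage = output["token_usage"]
        for k, v in token_usage.items():
            if k in overall_token_usage:
                overall_token_usage[k] += v
            else:
                # Assumed accumulation branch, not visible in the hunk.
                overall_token_usage[k] = v
    # Assumed return shape, not visible in the hunk.
    return {"token_usage": overall_token_usage}


# Example: one streamed generation (None) and one regular generation.
outputs = [None, {"token_usage": {"prompt_tokens": 12, "completion_tokens": 30}}]
print(combine_llm_outputs(outputs))
# {'token_usage': {'prompt_tokens': 12, 'completion_tokens': 30}}
```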
