bug: Duplicate event log to Langfuse trace #1872
Comments
Thanks for reporting this. Do you also use the OpenAI SDK wrapper within your application?
Note: this might be addressed when switching to instrumentation: #1931
This would be super helpful to know here.
Hi @marcklingen, reporting in that I see the same duplication of traces when using the observe decorator together with the OpenAI SDK wrapper. Is this expected behaviour? Anyhow, it is simple enough to use OpenAI's own SDK directly and set up a function to log the token usage and cost.
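The workaround mentioned above (calling the OpenAI SDK directly and logging token usage and cost yourself) could be sketched roughly as follows. Note this is a hypothetical illustration: the price table, the `usage_cost` helper, and the way the result is attached to Langfuse are all made up for the example, not taken from the Langfuse API.

```python
# Hypothetical sketch of the workaround: call the OpenAI SDK directly and
# compute token usage/cost yourself, then report it to Langfuse exactly once.
# The price table and helper name are invented for illustration.

# Example per-1K-token prices in USD (placeholder values, not real pricing).
PRICES_PER_1K = {"prompt": 0.0005, "completion": 0.0015}

def usage_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Compute an approximate cost in USD from token counts."""
    return (
        prompt_tokens / 1000 * PRICES_PER_1K["prompt"]
        + completion_tokens / 1000 * PRICES_PER_1K["completion"]
    )

# After a plain OpenAI SDK call, the response's usage field carries the token
# counts, e.g.:
#   cost = usage_cost(response.usage.prompt_tokens,
#                     response.usage.completion_tokens)
# That single number can then be attached to one Langfuse generation, avoiding
# the duplicated event entirely.
print(usage_cost(1000, 1000))  # 0.002
```

Because only your own logging function writes the generation event, the token count appears once per LLM call regardless of any decorators in the call stack.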
Hi @marcklingen, thank you for your reply. To answer your question: yes, I do use the SDK wrapper.
Describe the bug
I wanted to use LlamaParse to parse a set of documents (PDF/DOC/DOCX) and index them so that I could ask custom questions against those documents. In a series of tests where I simply parsed documents with LlamaParse and performed the indexing step with LlamaIndex, I encountered a duplicated token count: the same LLM generation event was logged twice.
To reproduce
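The mechanism behind the duplication can be illustrated with a self-contained sketch. This is not the real Langfuse internals; the decorator, client class, and event names below are hypothetical stand-ins showing why an instrumented SDK client wrapped again by a tracing decorator can record the same LLM call twice on one trace.

```python
# Conceptual sketch (NOT the real Langfuse implementation): both the tracing
# decorator and the instrumented SDK client append an event for the same call,
# producing the duplicated openai_llm / OpenAI-generation pair seen in the UI.
from functools import wraps

event_log: list[str] = []  # stands in for the events attached to one trace

def observe(fn):
    """Stand-in for a tracing decorator that logs the call as a generation."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        event_log.append(f"{fn.__name__}-generation")  # decorator-level event
        return fn(*args, **kwargs)
    return wrapper

class WrappedClient:
    """Stand-in for an SDK client wrapper that also logs each call."""
    def complete(self, prompt: str) -> str:
        event_log.append("OpenAI-generation")  # wrapper-level event
        return f"answer to: {prompt}"

client = WrappedClient()

@observe
def openai_llm(prompt: str) -> str:
    return client.complete(prompt)

openai_llm("What is in the document?")
print(event_log)  # two events recorded for a single LLM call
```

One call to `openai_llm` yields two generation events, which is exactly how a single OpenAI request can double the apparent token usage in the trace.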
Additional information
Focus on the red rectangles, which mark the duplicated openai_llm and OpenAI-generation events. This is problematic for token usage estimation in Langfuse: on the OpenAI side the token usage is logged only once, so Langfuse is logging an additional event with the same query and token usage.
Here is a screenshot from the Langfuse UI showing the trace. As you can see, the openai_llm and OpenAI-generation events are duplicated.