
🐛 Bug Report: Gemini (google generativeai) instrumentation sets the input message attributes incorrectly #2419

Open
dinmukhamedm opened this issue Dec 19, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@dinmukhamedm
Contributor

Which component is this bug for?

All Packages

📜 Description

The gen_ai.prompt.N.* attributes are not set correctly. I know these will eventually be sent as events instead, but for now this is an error.

👟 Reproduction steps

import google.generativeai as genai
import os
from traceloop.sdk import Traceloop

Traceloop.init()

genai.configure(api_key=os.environ['GEMINI_API_KEY'])
model = genai.GenerativeModel('gemini-1.5-flash')

def gemini_response(user_message: str):
    response = model.generate_content(user_message)
    print(response.text)

gemini_response('The opposite of hot is')
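
To inspect the exact attributes the instrumentation emits for this reproduction, one option is to route spans to an in-memory exporter. This is only a minimal sketch; it assumes Traceloop.init accepts the exporter and disable_batch keyword arguments used in other Traceloop examples.

import os

import google.generativeai as genai
from opentelemetry.sdk.trace.export.in_memory_span_exporter import InMemorySpanExporter
from traceloop.sdk import Traceloop

# Collect finished spans in memory instead of sending them to a backend.
# (Assumes Traceloop.init supports exporter= and disable_batch=.)
exporter = InMemorySpanExporter()
Traceloop.init(exporter=exporter, disable_batch=True)

genai.configure(api_key=os.environ['GEMINI_API_KEY'])
model = genai.GenerativeModel('gemini-1.5-flash')
model.generate_content('The opposite of hot is')

# Print every gen_ai.* attribute that was recorded on the finished spans.
for span in exporter.get_finished_spans():
    for key, value in span.attributes.items():
        if key.startswith('gen_ai.'):
            print(key, '=', repr(value))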

👍 Expected behavior

The following prompt and completion attributes should be set (by analogy with the OpenAI and Anthropic instrumentations):

gen_ai.prompt.0.content: "The opposite of hot is\n"
gen_ai.prompt.0.role: "user"
gen_ai.completion.0.content: "cold\n"
gen_ai.completion.0.role: "assistant"

👎 Actual Behavior with Screenshots

Instead, these attributes are set:

gen_ai.completion.0.content: "cold\n"
gen_ai.prompt.0.user: "The opposite of hot is\n"

🤖 Python Version

3.12

📃 Provide any additional context for the Bug.

No response

👀 Have you spent some time to check if this bug has been raised before?

  • I checked and didn't find similar issue

Are you willing to submit PR?

Yes I am willing to submit a PR!


dosubot bot commented Dec 19, 2024

To address the incorrect gen_ai.prompt.N.* attributes in the Gemini instrumentation, you can modify the _set_input_attributes function so the prompt attributes are set correctly for each request type. Here's a suggested approach:

  1. Modify the _set_input_attributes function to correctly set the gen_ai.prompt.N.* attributes. Ensure that the prompt and role are set according to the expected behavior.

Here's an example of how you can adjust the code:

def _set_input_attributes(span, llm_request_type, kwargs):
    if llm_request_type == LLMRequestTypeValues.COMPLETION:
        # Text completion: record the single prompt with an explicit role.
        set_span_attribute(
            span, f"{SpanAttributes.LLM_PROMPTS}.0.content", kwargs.get("prompt")
        )
        set_span_attribute(
            span, f"{SpanAttributes.LLM_PROMPTS}.0.role", "user"
        )
    elif llm_request_type == LLMRequestTypeValues.CHAT:
        # Chat: record every message's content and role under its own index.
        for i, message in enumerate(kwargs.get("messages") or []):
            set_span_attribute(
                span, f"{SpanAttributes.LLM_PROMPTS}.{i}.content", message.get("content")
            )
            set_span_attribute(
                span, f"{SpanAttributes.LLM_PROMPTS}.{i}.role", message.get("role")
            )
  2. Ensure the completion attributes are set correctly in the _set_span_completions function, similar to how the prompt attributes are set; a sketch follows below.
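
For reference, here is one possible shape for the completion side. This is only a sketch: it assumes the Gemini response exposes candidates with content.parts, and that SpanAttributes.LLM_COMPLETIONS, LLMRequestTypeValues, and set_span_attribute are available as in the snippet above.

def _set_span_completions(span, llm_request_type, response):
    if llm_request_type not in (LLMRequestTypeValues.COMPLETION, LLMRequestTypeValues.CHAT):
        return
    for i, candidate in enumerate(response.candidates):
        # Join all text parts of the candidate into a single completion string.
        text = "".join(part.text for part in candidate.content.parts)
        set_span_attribute(
            span, f"{SpanAttributes.LLM_COMPLETIONS}.{i}.content", text
        )
        set_span_attribute(
            span, f"{SpanAttributes.LLM_COMPLETIONS}.{i}.role", "assistant"
        )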

This approach will align the attribute setting with the expected behavior, ensuring that both the content and role are correctly recorded for prompts and completions [1][2].


dosubot added the bug (Something isn't working) label Dec 19, 2024
@nirga
Member

nirga commented Dec 19, 2024

You're right! Appreciate the willingness to fix this :)
