
feat: Enable API Logging with Helicone #598

Closed
wants to merge 3 commits

Conversation

Kabilan108

API requests will be logged via Helicone when the HELICONE_API_KEY variable is set

Describe the changes you have made:

  • Added an `if` block that sets `litellm.api_base` and `litellm.headers` when `HELICONE_API_KEY` is provided.

Reference any relevant issue (Fixes #000)

  • I have performed a self-review of my code:

I have tested the code on the following OS:

  • Windows
  • MacOS
  • Linux

AI Language Model (if applicable)

  • GPT4
  • GPT3
  • Llama 7B
  • Llama 13B
  • Llama 34B
  • Huggingface model (Please specify which one)

Comment on lines 13 to 16
if os.getenv("HELICONE_API_KEY"):
    litellm.api_base = "https://oai.hconeai.com/v1"
    litellm.headers = {"Helicone-Auth": f"Bearer {os.getenv('HELICONE_API_KEY')}"}

Collaborator

I like the idea of making it simple to use, but this approach breaks the existing --api_base parameter and prevents users from also testing a local model or another API proxy.

(screenshot: OpenInterpreter-Helicone)

I think we should consider making third-party integrations like this opt-in via a CLI parameter or a flag in the user's config.yaml file that can be overridden.

Maybe adding some sort of integrations array to the config.yaml and then if the user has helicone as an entry in that array, we can enable helicone and try to pick up the API key? It might need to follow something similar to the functionality we use in interpreter/terminal_interface/validate_llm_settings.py to make sure we can find the HELICONE_API_KEY if Helicone is enabled.

Overall, this is a solid idea, and it would be nice to be able to track Open Interpreter's token usage and other data points in the Helicone dashboard.
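The opt-in approach described above could be sketched roughly as follows. This is a hypothetical helper, not code from the PR: the `configure_integrations` name is invented, the config is assumed to be the parsed config.yaml with an `integrations` list, and the litellm module is passed in as a parameter so the sketch stays testable. It also skips overriding `api_base` when the user has already set one, to avoid the --api_base conflict noted above.

```python
import os


def configure_integrations(config: dict, litellm_module) -> None:
    """Enable opt-in third-party integrations listed in the user's config.

    `config` is the parsed config.yaml; `litellm_module` is the imported
    litellm module (injected here so the sketch is easy to unit-test).
    """
    integrations = config.get("integrations", [])
    if "helicone" in integrations:
        key = os.getenv("HELICONE_API_KEY")
        if not key:
            # Mirrors the spirit of validate_llm_settings.py: fail loudly
            # when an enabled integration is missing its credentials.
            raise ValueError("helicone is enabled but HELICONE_API_KEY is not set")
        # Only take over api_base if the user has not pointed it elsewhere
        # (e.g. a local model or another proxy via --api_base).
        if getattr(litellm_module, "api_base", None) is None:
            litellm_module.api_base = "https://oai.hconeai.com/v1"
        litellm_module.headers = {"Helicone-Auth": f"Bearer {key}"}
```

With `integrations: [helicone]` in config.yaml and the key exported, this routes requests through Helicone; otherwise it leaves litellm untouched.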

Author

Thanks for the feedback! I implemented this approach and pushed a new commit!
Right now, I added the validation for integrations in the setup_llm.py file, but do you think it would be better to have a separate module for it? I've only added Helicone for now, but I'm thinking of adding support for other LLM monitoring tools like LangSmith and llm.report.

@ishaan-jaff
Contributor

@Kabilan108 we expose callbacks through litellm - you can set a custom callback to log data to Helicone if you'd like: https://docs.litellm.ai/docs/observability/custom_callback
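A minimal sketch of that callback approach: the four-argument signature follows the litellm custom-callback docs linked above, but the function name is invented, the response is treated as a plain dict for illustration, and forwarding to Helicone is left as a comment rather than a real HTTP call.

```python
def usage_callback(kwargs, completion_response, start_time, end_time):
    """Summarize a completed litellm request.

    A real handler would forward this summary to Helicone (or another
    monitoring tool) instead of returning it.
    """
    usage = completion_response.get("usage", {})
    return {
        "model": kwargs.get("model"),
        "total_tokens": usage.get("total_tokens", 0),
        "latency_s": end_time - start_time,
    }


# Registration (requires litellm to be installed):
#   import litellm
#   litellm.success_callback = [usage_callback]
```

This would keep `litellm.api_base` untouched, sidestepping the --api_base conflict entirely, at the cost of logging via Helicone's async endpoint rather than its proxy.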

- Add support for specifying external integrations in the user's config file.
- Currently supports API request logging with Helicone.
@ericrallen ericrallen added the External This Issue is related to an external dependency and not Open Interpreter's core label Oct 27, 2023
temperature: 0
integrations: [helicone]
Collaborator

This might be good as an example in the docs, but we won't want to enable this by default in the core config.yaml.

Collaborator

It’s hard to tell, but I don’t think there are any actual changes to this file that are relevant to this PR.

@KillianLucas
Collaborator

KillianLucas commented May 5, 2024

Hey Kabilan! We're now approaching integrations with simple guides like this one. Let me know if you'd like to reopen this PR, or make a new one into our docs showing how to integrate Open Interpreter with Helicone. Thanks!
