
[Bug]: Unclear Error message with expired enterprise license #7289

Open
parkerkain-8451 opened this issue Dec 18, 2024 · 0 comments
Labels: bug (Something isn't working), mlops, user request

Comments


parkerkain-8451 commented Dec 18, 2024

What happened?

Our enterprise license expired and needed to be replaced. However, when we tried to investigate what was going on, the message we got was:
Exception: You must be a LiteLLM Enterprise user to use this feature. If you have a license please set LITELLM_LICENSE in your env. If you want to obtain a license meet with us here: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat.

This makes it look like the issue is a missing env variable. However, we did have LITELLM_LICENSE set; it was just pointing to an expired license. Separate error messages for "no license detected" versus "invalid or expired license detected" would have sped up our debugging immensely.

Relevant log output

proxy-1     |     raise Exception(
proxy-1     | Exception: You must be a LiteLLM Enterprise user to use this feature. If you have a license please set `LITELLM_LICENSE` in your env. If you want to obtain a license meet with us here: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat.
proxy-1     | Pricing: https://www.litellm.ai/#pricing
proxy-1     |
proxy-1     | ERROR:    Application startup failed. Exiting.
proxy-1     |
proxy-1     | #------------------------------------------------------------#
proxy-1     | #                                                            #
proxy-1     | #            'This product would be better if...'             #
proxy-1     | #        https://github.com/BerriAI/litellm/issues/new        #
proxy-1     | #                                                            #
proxy-1     | #------------------------------------------------------------#
proxy-1     |
proxy-1     |  Thank you for using LiteLLM! - Krrish & Ishaan
proxy-1     |
proxy-1     |
proxy-1     |
proxy-1     | Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
proxy-1     |
proxy-1     |
proxy-1 exited with code 3

Are you a ML Ops Team?

Yes

What LiteLLM version are you on?

v1.54.0

Twitter / LinkedIn details

No response

@parkerkain-8451 parkerkain-8451 added the bug Something isn't working label Dec 18, 2024