Relevant log output
LiteLLM completion() model= abc/xyz:c1bc50532a26...81831e55c; provider = replicate
HTTP Request: GET https://huggingface.co/abc/xyz:c1bc50532a2f2...1831e55c/raw/main/tokenizer_config.json "HTTP/1.1 401 Unauthorized"
HTTP Request: POST https://api.replicate.com/v1/models/c1bc50532a2..7c8d29881831e55c/predictions "HTTP/1.1 404 Not Found"
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-12-20T06:42:23.713006Z [error ] Error generating lyrics: litellm.APIError: ReplicateException - Client error '404 Not Found' for url 'https://api.replicate.com/v1/models/c1bc50532a265a8f27a72f4a...81831e55c/predictions'
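A side note on the 401 in the log above: the GET looks like the full model string, including the `:version` suffix, being used verbatim as a Hugging Face repo id when the tokenizer_config.json is fetched. A hypothetical illustration (the function name and logic here are mine, not LiteLLM's actual code):

```python
def hf_tokenizer_config_url(model: str) -> str:
    """Hypothetical: rebuild the tokenizer lookup URL seen in the log.

    Treating the whole "owner/name:version" string as a Hugging Face
    repo id produces exactly this kind of URL, and a private or
    nonexistent repo answers it with 401 Unauthorized.
    """
    return f"https://huggingface.co/{model}/raw/main/tokenizer_config.json"
```

For example, `hf_tokenizer_config_url("abc/xyz:<version-hash>")` reproduces the shape of the GET target in the log above.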
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
^1.55.7
Twitter / LinkedIn details
No response
What happened?
I have a private Replicate model.
I've set the Replicate env var:
REPLICATE_API_KEY=<key found in the model page>
But when I run it, I get the auth errors shown in the relevant log output above.
A couple of issues. First, I'm not sure why it's trying to look anything up on Hugging Face at all. Second, the prediction endpoint URL doesn't contain the owner/model name, just models/<version hash>, so the 404 makes sense; it should be something like org/model_name (plus the version) instead.
Is LiteLLM's support for private Replicate models broken?
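For what it's worth, here is how I'd expect an `owner/name:version` string to map onto Replicate's documented prediction endpoints. This is a sketch based on Replicate's public HTTP API, not LiteLLM's actual routing code: `POST /v1/predictions` takes the version hash in the request body, while `POST /v1/models/{owner}/{name}/predictions` runs a model without pinning a version.

```python
BASE = "https://api.replicate.com/v1"

def prediction_request(model: str) -> tuple[str, dict]:
    """Sketch: map "owner/name[:version]" to a Replicate endpoint and body.

    The failing request in the log instead ended up as
    POST /v1/models/<version-hash>/predictions, i.e. the version hash
    landed where the owner/name belongs.
    """
    if ":" in model:
        # Version-pinned model: use the generic predictions endpoint
        # and pass the version hash in the request body.
        _name, version = model.split(":", 1)
        return f"{BASE}/predictions", {"version": version}
    # Unpinned model: use the model-scoped endpoint.
    return f"{BASE}/models/{model}/predictions", {}
```

Either way, the version hash should never replace the owner/name path segment the way it does in the 404 URL above.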