[Bug]: Invalid double await in ollama embeddings in Proxy (fix in report) #7366

Open
aguadoenzo opened this issue Dec 22, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@aguadoenzo

What happened?

Using the OpenAI SDK against the LiteLLM proxy: when creating embeddings with the Ollama nomic-embed-text model, the proxy throws the following error.

litellm-1  | Traceback (most recent call last):
litellm-1  |   File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 3070, in aembedding
litellm-1  |     response = await init_response  # type: ignore
litellm-1  |                ^^^^^^^^^^^^^^^^^^^
litellm-1  |   File "/usr/local/lib/python3.11/site-packages/litellm/llms/ollama/completion/handler.py", line 77, in ollama_aembeddings
litellm-1  |     response_json = await response.json()
litellm-1  |                     ^^^^^^^^^^^^^^^^^^^^^
litellm-1  | TypeError: object dict can't be used in 'await' expression
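
The request that triggers it is an ordinary embeddings call through the OpenAI SDK. A minimal reproduction sketch, assuming the proxy listens on http://localhost:4000 and exposes a model alias nomic-embed-text (both placeholders for your own setup):

from openai import OpenAI

# Placeholder proxy URL, API key, and model alias; adjust to your LiteLLM config.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-anything")

# Any embeddings request routed to an Ollama embedding model produces the traceback above.
response = client.embeddings.create(
    model="nomic-embed-text",
    input="hello world",
)
print(response.data[0].embedding[:5])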

The problem is in ollama_aembeddings in litellm/llms/ollama/completion/handler.py: the response has already been awaited earlier, so response.json() returns a plain dict and must not be awaited a second time.

response_json = await response.json()

The following change fixes the issue (tested locally):

response_json = response.json() 
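
For context, a minimal sketch of why the extra await fails, using httpx.AsyncClient as an illustrative async HTTP client (the function name and endpoint below are illustrative, not the actual handler code): the request has already been awaited once, and httpx.Response.json() is a regular synchronous method, so putting a second await in front of its dict result raises the TypeError above.

import asyncio
import httpx

async def ollama_embed_sketch(text: str) -> dict:
    async with httpx.AsyncClient() as client:
        # This await resolves the request; `resp` is a plain httpx.Response.
        resp = await client.post(
            "http://localhost:11434/api/embed",  # illustrative Ollama endpoint
            json={"model": "nomic-embed-text", "input": text},
        )
        resp.raise_for_status()
        # httpx.Response.json() is synchronous and returns a dict; adding a second
        # `await` here raises "object dict can't be used in 'await' expression".
        return resp.json()

asyncio.run(ollama_embed_sketch("hello world"))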

I'm submitting an issue rather than a PR because I'm not familiar enough with the code base or with the impact this change might have, and I can't really write tests for a PR.

Relevant log output

See the traceback above.

Are you an ML Ops Team?

No

What LiteLLM version are you on?

ghcr.io/berriai/litellm:main-latest, which at the time of writing is v1.55.8-stable

Twitter / LinkedIn details

No response

aguadoenzo added the bug label Dec 22, 2024
@datGryphon

I also experienced this bug on 1.55.9

@hermanmak

Also experiencing this issue

Obsidian Copilot -> Open WebUI's OpenAI proxy endpoint -> LiteLLM Ollama proxy -> Ollama snowflake-arctic-embed2

litellm  |   File "/usr/local/lib/python3.13/site-packages/litellm/llms/ollama/completion/handler.py", line 55, in ollama_aembeddings
litellm  |     response_json = await response.json()
litellm  |                     ^^^^^^^^^^^^^^^^^^^^^
litellm  | TypeError: object dict can't be used in 'await' expression
