
[Bug]: Malformed input request: #: extraneous key [chat_id] is not permitted (Bedrock/Claude Models) #7416

Closed
sammcj opened this issue Dec 25, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@sammcj
Contributor

sammcj commented Dec 25, 2024

What happened?

When using LiteLLM with Amazon Bedrock, requests to Anthropic Claude models often fail with 'Malformed input request: #: extraneous key [chat_id] is not permitted'.

To reproduce:

  1. Set up LiteLLM with access to models running on Amazon Bedrock
  2. Use the following example config:

```yaml
model_list:
  - model_name: bedrock-anthropic.claude-3-5-sonnet-20241022-v2:0
    litellm_params:
      model: "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0"
      aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID
      aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
      aws_region_name: "us-west-2"
```

  3. Make any request to bedrock-anthropic.claude-3-5-sonnet-20241022-v2:0 via Open WebUI
  4. See the request fail both in the client and on the litellm proxy container:

```
400: litellm.BadRequestError: BedrockException - {"message":"The model returned the following errors: Malformed input request: #: extraneous key [chat_id] is not permitted, please reformat your input and try again."}
```

This also occurs if you set `drop_params: True`.

Relevant log output

09:46:51 - LiteLLM Proxy:ERROR: proxy_server.py:3429 - litellm.proxy.proxy_server.chat_completion(): Exception occured - litellm.BadRequestError: BedrockException - {"message":"The model returned the following errors: Malformed input request: #: extraneous key [chat_id] is not permitted, please reformat your input and try again."}No fallback model group found for original model_group=bedrock-anthropic.claude-3-5-sonnet-20241022-v2:0. Fallbacks=[{'bedrock-anthropic.claude-3-5-sonnet-latest': 'bedrock-anthropic.claude-3-5-haiku-latest'}, {'bedrock-anthropic.claude-3-5-haiku-latest': 'qwen2.5-coder-32b-instruct-128k:q6_k'}, {'bedrock-meta.llama3-2-90b-instruct-v1': 'bedrock-meta.llama3-2-11b-instruct-v1'}, {'qwen2.5-coder-32b-instruct-128k:q6_k': 'qwen2.5-coder-14b-instruct-128k:q6_k'}, {'qwen2.5-coder-14b-instruct-128k:q6_k': 'qwen2.5-coder-7b-instruct-128k:q6_k'}]
Received Model Group=bedrock-anthropic.claude-3-5-sonnet-20241022-v2:0
Available Model Group Fallbacks=None
Error doing the fallback: litellm.BadRequestError: BedrockException - {"message":"The model returned the following errors: Malformed input request: #: extraneous key [chat_id] is not permitted, please reformat your input and try again."}No fallback model group found for original model_group=bedrock-anthropic.claude-3-5-sonnet-20241022-v2:0. Fallbacks=[{'bedrock-anthropic.claude-3-5-sonnet-latest': 'bedrock-anthropic.claude-3-5-haiku-latest'}, {'bedrock-anthropic.claude-3-5-haiku-latest': 'qwen2.5-coder-32b-instruct-128k:q6_k'}, {'bedrock-meta.llama3-2-90b-instruct-v1': 'bedrock-meta.llama3-2-11b-instruct-v1'}, {'qwen2.5-coder-32b-instruct-128k:q6_k': 'qwen2.5-coder-14b-instruct-128k:q6_k'}, {'qwen2.5-coder-14b-instruct-128k:q6_k': 'qwen2.5-coder-7b-instruct-128k:q6_k'}] LiteLLM Retried: 4 times, LiteLLM Max Retries: 5 LiteLLM Retried: 4 times, LiteLLM Max Retries: 5
Traceback (most recent call last):
  File "/usr/local/lib/python3.13/site-packages/litellm/llms/bedrock/chat/invoke_handler.py", line 184, in make_call
    response = await client.post(
               ^^^^^^^^^^^^^^^^^^
    ...<4 lines>...
    )
    ^
  File "/usr/local/lib/python3.13/site-packages/litellm/llms/custom_httpx/http_handler.py", line 219, in post
    raise e
  File "/usr/local/lib/python3.13/site-packages/litellm/llms/custom_httpx/http_handler.py", line 177, in post
    response.raise_for_status()
    ~~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/usr/local/lib/python3.13/site-packages/httpx/_models.py", line 761, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://bedrock-runtime.us-west-2.amazonaws.com/model/anthropic.claude-3-5-sonnet-20241022-v2:0/converse-stream'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.13/site-packages/litellm/main.py", line 482, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/llms/bedrock/chat/converse_handler.py", line 115, in async_streaming
    completion_stream = await make_call(
                        ^^^^^^^^^^^^^^^^
    ...<9 lines>...
    )
    ^
  File "/usr/local/lib/python3.13/site-packages/litellm/llms/bedrock/chat/invoke_handler.py", line 230, in make_call
    raise BedrockError(status_code=error_code, message=err.response.text)
litellm.llms.bedrock.common_utils.BedrockError: {"message":"The model returned the following errors: Malformed input request: #: extraneous key [chat_id] is not permitted, please reformat your input and try again."}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.13/site-packages/litellm/proxy/proxy_server.py", line 3318, in chat_completion
    responses = await llm_responses
                ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 835, in acompletion
    raise e
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 811, in acompletion
    response = await self.async_function_with_fallbacks(**kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 2811, in async_function_with_fallbacks
    raise original_exception
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 2768, in async_function_with_fallbacks
    raise original_exception
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 2627, in async_function_with_fallbacks
    response = await self.async_function_with_retries(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        *args, **kwargs, mock_timeout=mock_timeout
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 2989, in async_function_with_retries
    raise original_exception
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 2895, in async_function_with_retries
    response = await self.make_call(original_function, *args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 2998, in make_call
    response = await response
               ^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 960, in _acompletion
    raise e
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 928, in _acompletion
    response = await _response
               ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/utils.py", line 1195, in wrapper_async
    raise e
  File "/usr/local/lib/python3.13/site-packages/litellm/utils.py", line 1049, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/main.py", line 504, in acompletion
    raise exception_type(
          ~~~~~~~~~~~~~~^
        model=model,
        ^^^^^^^^^^^^
    ...<3 lines>...
        extra_kwargs=kwargs,
        ^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/local/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2146, in exception_type
    raise e
  File "/usr/local/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 767, in exception_type
    raise BadRequestError(
    ...<4 lines>...
    )
litellm.exceptions.BadRequestError: litellm.BadRequestError: BedrockException - {"message":"The model returned the following errors: Malformed input request: #: extraneous key [chat_id] is not permitted, please reformat your input and try again."}No fallback model group found for original model_group=bedrock-anthropic.claude-3-5-sonnet-20241022-v2:0. Fallbacks=[{'bedrock-anthropic.claude-3-5-sonnet-latest': 'bedrock-anthropic.claude-3-5-haiku-latest'}, {'bedrock-anthropic.claude-3-5-haiku-latest': 'qwen2.5-coder-32b-instruct-128k:q6_k'}, {'bedrock-meta.llama3-2-90b-instruct-v1': 'bedrock-meta.llama3-2-11b-instruct-v1'}, {'qwen2.5-coder-32b-instruct-128k:q6_k': 'qwen2.5-coder-14b-instruct-128k:q6_k'}, {'qwen2.5-coder-14b-instruct-128k:q6_k': 'qwen2.5-coder-7b-instruct-128k:q6_k'}]
Received Model Group=bedrock-anthropic.claude-3-5-sonnet-20241022-v2:0
Available Model Group Fallbacks=None
Error doing the fallback: litellm.BadRequestError: BedrockException - {"message":"The model returned the following errors: Malformed input request: #: extraneous key [chat_id] is not permitted, please reformat your input and try again."}No fallback model group found for original model_group=bedrock-anthropic.claude-3-5-sonnet-20241022-v2:0. Fallbacks=[{'bedrock-anthropic.claude-3-5-sonnet-latest': 'bedrock-anthropic.claude-3-5-haiku-latest'}, {'bedrock-anthropic.claude-3-5-haiku-latest': 'qwen2.5-coder-32b-instruct-128k:q6_k'}, {'bedrock-meta.llama3-2-90b-instruct-v1': 'bedrock-meta.llama3-2-11b-instruct-v1'}, {'qwen2.5-coder-32b-instruct-128k:q6_k': 'qwen2.5-coder-14b-instruct-128k:q6_k'}, {'qwen2.5-coder-14b-instruct-128k:q6_k': 'qwen2.5-coder-7b-instruct-128k:q6_k'}] LiteLLM Retried: 4 times, LiteLLM Max Retries: 5 LiteLLM Retried: 4 times, LiteLLM Max Retries: 5
09:46:51 - LiteLLM Proxy:ERROR: _common.py:120 - Giving up chat_completion(...) after 1 tries (litellm.proxy._types.ProxyException)
INFO:     172.22.0.4:33854 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request


### Are you an ML Ops Team?

No

### What LiteLLM version are you on?

1.55.12

### Twitter / LinkedIn details

_No response_
@sammcj sammcj added the bug Something isn't working label Dec 25, 2024
@sammcj
Contributor Author

sammcj commented Dec 25, 2024

I think I've been able to narrow this down to being an issue only with certain clients, such as Open WebUI.

Is it possible that litellm is exposing the model to clients with an incorrect feature set / prompt format?

@krrishdholakia
Contributor

> extraneous key [chat_id] is not permitted, please reformat your input and try again

This is not a litellm error. The client is sending an extra key, `chat_id`, in the request body.

You can drop it by specifying it as an additional_drop_param - https://docs.litellm.ai/docs/completion/drop_params#specify-params-to-drop
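Applied to the config from the original report, that would look something like the following sketch. The `additional_drop_params` key under `litellm_params` follows the linked docs; whether it covers this exact case in the reporter's LiteLLM version (1.55.12) is an assumption:

```yaml
model_list:
  - model_name: bedrock-anthropic.claude-3-5-sonnet-20241022-v2:0
    litellm_params:
      model: "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0"
      aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID
      aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
      aws_region_name: "us-west-2"
      # Drop the client-supplied chat_id key before the request is
      # forwarded to Bedrock (assumed fix, per the drop_params docs)
      additional_drop_params: ["chat_id"]
```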

@krrishdholakia krrishdholakia closed this as not planned on Dec 27, 2024