
HTTP Worker Thread Blocked and Invalid Type Warning from Opentelemetry _logs #3858

Closed
hyoinandout opened this issue Apr 17, 2024 · 1 comment · Fixed by #3859
Labels: bug (Something isn't working)

@hyoinandout (Contributor) commented:

This issue appears to stem from the locks used by OpenTelemetry's attribute handling, which can manifest as a deadlock in the application.

Describe your environment

  • Python version: 3.12
  • Dependencies:
    opentelemetry-api==1.22.0
    opentelemetry-exporter-otlp-proto-common==1.22.0
    opentelemetry-exporter-otlp-proto-grpc==1.22.0
    opentelemetry-instrumentation==0.43b0
    opentelemetry-instrumentation-requests==0.43b0
    opentelemetry-proto==1.22.0
    opentelemetry-sdk==1.22.0
    opentelemetry-semantic-conventions==0.43b0
    opentelemetry-util-http==0.43b0
    sentry-sdk==1.39.2

Steps to reproduce
(A minimal sketch of the underlying lock re-entrancy follows the list below.)

  1. Start the HTTP worker threads in the application.
  2. Send multiple REST API requests to a specific endpoint.
  3. Monitor the behavior of the HTTP worker threads or the log files for any warning messages.
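For reference, here is a minimal, self-contained sketch of the re-entrancy pattern behind these steps. It does not use the real SDK classes: BoundedAttrs and SerializingHandler are hypothetical stand-ins for opentelemetry.attributes.BoundedAttributes and a serializing log handler such as Sentry's, chosen only to show how a warning logged under a non-reentrant lock can hang a thread:

```python
import logging
import threading


class BoundedAttrs(dict):
    """Toy stand-in for BoundedAttributes; not the real SDK class."""

    def __init__(self):
        super().__init__()
        self._lock = threading.Lock()  # non-reentrant, as in the SDK at 1.22.0

    def __setitem__(self, key, value):
        with self._lock:  # lock held while the value is validated...
            if value is None:
                # ...and validation logs a warning, re-entering logging
                logging.getLogger("toy.attributes").warning(
                    "Invalid type NoneType for attribute %r value", key
                )
            super().__setitem__(key, value)

    def snapshot(self):
        with self._lock:  # second acquire on the same thread never returns
            return dict(self)


class SerializingHandler(logging.Handler):
    """Stand-in for a handler that serializes shared state (e.g. Sentry's)."""

    def __init__(self, attrs):
        super().__init__()
        self.attrs = attrs

    def emit(self, record):
        self.attrs.snapshot()  # blocks: the lock is already held by this thread


attrs = BoundedAttrs()
logging.getLogger("toy.attributes").addHandler(SerializingHandler(attrs))
attrs["exception.message"] = None  # hangs here, mirroring the blocked worker
```

Running this hangs on the last line, which mirrors what the worker threads do in the core dump below.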

What is the expected behavior?

  • The HTTP worker threads should handle incoming requests to the endpoint without any blocking or warning messages related to attribute types.

What is the actual behavior?

  • The HTTP worker threads get blocked inside the logging handle function while waiting on a lock from opentelemetry/attributes/__init__.py. In addition, the Server Thread emits warning logs about an invalid type (NoneType) for the 'exception.message' attribute. (A quick way to confirm which frame is blocked is sketched below.)
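If it helps others confirm the symptom: assuming you can touch the application's startup code, Python's standard-library faulthandler can dump every thread's stack on demand, showing exactly which frame is waiting on the lock (signal registration is Unix-only):

```python
import faulthandler
import signal

# Register once at startup; afterwards `kill -USR1 <pid>` writes all
# thread stacks to stderr, including the frame blocked on the lock.
faulthandler.register(signal.SIGUSR1, all_threads=True)
```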

Additional context

  • Every time a thread blocks, the application emits the following log line (the validation that produces it is paraphrased after the core dump below):
2024-04-11 09:22:57,126 - 1 CP Server Thread-6 opentelemetry.attributes WARNING Invalid type NoneType for attribute 'exception.message' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types
  • Filtered core dump of the application:
...
        current_app.logger.warning(

    (Python) File "/opt/.pyenv/versions/3.12.1/lib/python3.12/logging/__init__.py", line 1551, in warning
        self._log(WARNING, msg, args, **kwargs)
    (Python) File "/opt/.pyenv/versions/3.12.1/lib/python3.12/logging/__init__.py", line 1684, in _log
        self.handle(record)
    (Python) File "/opt/.pyenv/versions/3.12.1/lib/python3.12/logging/__init__.py", line 1700, in handle
        self.callHandlers(record)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/integrations/logging.py", line 99, in sentry_patched_callhandlers
        return old_callhandlers(self, record)
    (Python) File "/opt/.pyenv/versions/3.12.1/lib/python3.12/logging/__init__.py", line 1762, in callHandlers
        hdlr.handle(record)
    (Python) File "/opt/.pyenv/versions/3.12.1/lib/python3.12/logging/__init__.py", line 1028, in handle
        self.emit(record)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/opentelemetry/sdk/_logs/_internal/__init__.py", line 542, in emit
        self._logger.emit(self._translate(record))
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/opentelemetry/sdk/_logs/_internal/__init__.py", line 523, in _translate
        return LogRecord(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/opentelemetry/sdk/_logs/_internal/__init__.py", line 183, in __init__
        "attributes": BoundedAttributes(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/opentelemetry/attributes/__init__.py", line 154, in __init__
        self[key] = value
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/opentelemetry/attributes/__init__.py", line 173, in __setitem__
        value = _clean_attribute(key, value, self.max_value_len)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/opentelemetry/attributes/__init__.py", line 100, in _clean_attribute
        _logger.warning(

    (Python) File "/opt/.pyenv/versions/3.12.1/lib/python3.12/logging/__init__.py", line 1551, in warning
        self._log(WARNING, msg, args, **kwargs)
    (Python) File "/opt/.pyenv/versions/3.12.1/lib/python3.12/logging/__init__.py", line 1684, in _log
        self.handle(record)
    (Python) File "/opt/.pyenv/versions/3.12.1/lib/python3.12/logging/__init__.py", line 1700, in handle
        self.callHandlers(record)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/integrations/logging.py", line 108, in sentry_patched_callhandlers
        integration._handle_record(record)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/integrations/logging.py", line 79, in _handle_record
        self._handler.handle(record)
    (Python) File "/opt/.pyenv/versions/3.12.1/lib/python3.12/logging/__init__.py", line 1028, in handle
        self.emit(record)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/integrations/logging.py", line 179, in emit
        return self._emit(record)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/integrations/logging.py", line 253, in _emit
        hub.capture_event(event, hint=hint)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/hub.py", line 349, in capture_event
        rv = client.capture_event(event, hint, scope)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/client.py", line 586, in capture_event
        event_opt = self._prepare_event(event, hint, scope)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/client.py", line 383, in _prepare_event
        event = serialize(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 409, in serialize
        serialized_event = _serialize_node(event, **kwargs)
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 237, in _serialize_node
        return _serialize_node_impl(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 345, in _serialize_node_impl
        v = _serialize_node(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 237, in _serialize_node
        return _serialize_node_impl(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 345, in _serialize_node_impl
        v = _serialize_node(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 237, in _serialize_node
        return _serialize_node_impl(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 372, in _serialize_node_impl
        _serialize_node(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 237, in _serialize_node
        return _serialize_node_impl(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 345, in _serialize_node_impl
        v = _serialize_node(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 237, in _serialize_node
        return _serialize_node_impl(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 345, in _serialize_node_impl
        v = _serialize_node(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 237, in _serialize_node
        return _serialize_node_impl(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 372, in _serialize_node_impl
        _serialize_node(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 237, in _serialize_node
        return _serialize_node_impl(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 345, in _serialize_node_impl
        v = _serialize_node(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 237, in _serialize_node
        return _serialize_node_impl(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 345, in _serialize_node_impl
        v = _serialize_node(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 237, in _serialize_node
        return _serialize_node_impl(

    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/sentry_sdk/serializer.py", line 334, in _serialize_node_impl
        obj = dict(iteritems(obj))
    (Python) File "<frozen _collections_abc>", line 893, in __iter__
    (Python) File "/opt/virtualenv/lib/python3.12/site-packages/opentelemetry/attributes/__init__.py", line 192, in __iter__
        with self._lock:
...
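Reading the dump bottom-up shows the cycle: BoundedAttributes.__setitem__ (attributes/__init__.py:173) holds self._lock while _clean_attribute logs its warning; Sentry's patched handler then captures and serializes the event, and the serializer's dict(iteritems(obj)) call lands in BoundedAttributes.__iter__ (attributes/__init__.py:192), which tries to take the same non-reentrant lock on the same thread. For context, here is a condensed paraphrase of the validation that emits the warning (simplified from the 1.22.0 source, with sequence handling and truncation omitted; not a drop-in copy):

```python
import logging

_logger = logging.getLogger(__name__)
_VALID_TYPES = (bool, str, bytes, int, float)


def clean_attribute(key, value):
    """Condensed paraphrase of opentelemetry.attributes._clean_attribute."""
    if isinstance(value, _VALID_TYPES):
        return value
    # None -- or any other unsupported type -- lands here. In the SDK this
    # warning is logged while BoundedAttributes' lock is still held, which
    # is what makes a re-entrant logging handler deadlock.
    _logger.warning(
        "Invalid type %s for attribute '%s' value. Expected one of "
        "['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types",
        type(value).__name__,
        key,
    )
    return None
```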

The blocked HTTP threads and the accompanying warning messages impact the overall performance and behavior of the application. I wonder whether this deadlock is a known issue.

@hyoinandout added the bug (Something isn't working) label Apr 17, 2024
@hyoinandout changed the title from "HTTP Worker Thread Blocked and Invalid Type Warning from Opentelemetry _logs SDK" to "HTTP Worker Thread Blocked and Invalid Type Warning from Opentelemetry _logs" Apr 17, 2024
@hyoinandout (Contributor, Author) commented May 21, 2024:

It turns out to be true. In the first screenshot, a thread acquires and holds the lock:

[screenshot: the thread holding the lock]

It then reaches the deadlock by trying to acquire the same lock it is already holding:

[screenshot: the same thread blocking on its second acquire of the lock]
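For anyone hitting this before the fix lands: the classic remedy for this pattern is a re-entrant lock, which lets the owning thread re-acquire without blocking. A minimal illustration of the difference (my sketch; I have not verified that #3859 makes exactly this change):

```python
import threading

# A plain Lock has no owner: a second acquire by the same thread blocks.
lock = threading.Lock()
lock.acquire()
print(lock.acquire(timeout=1))   # False after 1s -- would deadlock if blocking
lock.release()

# An RLock tracks its owner: the same thread may acquire it recursively.
rlock = threading.RLock()
rlock.acquire()
print(rlock.acquire(timeout=1))  # True, immediately
rlock.release()
rlock.release()                  # one release per acquire
```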
