chore(kafka): update old Kafka #19199

Draft · wants to merge 1 commit into base: develop
@@ -14,15 +14,15 @@ freshnessValidatedDate: never

The New Relic Java agent automatically collects data from [Kafka](https://kafka.apache.org/documentation/)'s Java clients library. Because Kafka is a high-performance messaging system that generates a lot of data, you can customize the agent for your app's specific throughput and use cases.

This document explains how to collect and view three types of Kafka data:
This document explains how to collect and view these Kafka data types:

* [Kafka metrics](#view-kafka-metrics)
* [Kafka events](#collect-kafka-events)
* [Enable Kafka Streams transactions](#collect-kafka-streams-transactions)
* [Kafka Streams transactions](#collect-kafka-streams-transactions)
* [Kafka distributed traces](#collect-kafka-distributed-traces)

<Callout variant="tip">
We also have a Kafka integration. For details on that, see [Kafka monitoring integration](/docs/integrations/host-integrations/host-integrations-list/kafka-monitoring-integration).
We also have a Kafka integration, which [does XYZ]. For details on that, see [Kafka monitoring integration](/docs/integrations/host-integrations/host-integrations-list/kafka-monitoring-integration).
</Callout>

## Requirements [#requirements]
@@ -31,7 +31,7 @@ Kafka clients instrumentation is available in Java agent versions 4.12.0 or higher

## View Kafka metrics

After [installation](/docs/agents/java-agent/installation/install-java-agent), the agent automatically reports rich Kafka metrics with information about messaging rates, latency, lag, and more. The Java agent collects all [Kafka consumer and producer metrics](https://kafka.apache.org/documentation/#monitoring) (but not connect or stream metrics).
After [installation](/docs/agents/java-agent/installation/install-java-agent), the Java agent automatically reports rich Kafka metrics with information about messaging rates, latency, lag, and more. The agent collects all [Kafka consumer and producer metrics](https://kafka.apache.org/documentation/#monitoring) (but not connect or stream metrics).

To view these metrics, create a custom dashboard:

@@ -84,7 +84,7 @@ To view these metrics, create a custom dashboard:
* The metric is not numeric or its value is `NaN`. New Relic only accepts metrics with a numeric value.
</Callout>

## Enable Kafka event collection [#collect-kafka-events]
## View Kafka event collection [#collect-kafka-events]

You can configure the agent to collect event data instead of metric timeslice data (for the difference between metric timeslice and event data, see [data collection](/docs/using-new-relic/metrics/analyze-your-metrics/data-collection-metric-timeslice-event-data#overview)). This allows you to use [NRQL](/docs/insights/nrql-new-relic-query-language/using-nrql/introduction-nrql) to filter and facet the default Kafka metrics. When enabled, the agent collects one Kafka event every 30 seconds. This event contains all of the data from [Kafka consumer and producer metrics](https://kafka.apache.org/documentation/#monitoring) captured since the previous event.

@@ -128,7 +128,7 @@ class_transformer:
enabled: true
```
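
For reference, here is a minimal `newrelic.yml` sketch of the event-collection toggle described above. The `kafka.metrics.as_events.enabled` setting name is an assumption here; verify it against the configuration reference for your agent version.

```yml
# Sketch only: switches Kafka metric reporting to event reporting.
# The kafka.metrics.as_events.enabled setting name is an assumption;
# verify it against the configuration reference for your agent version.
kafka:
  metrics:
    as_events:
      enabled: true
```

Once events are flowing, you can filter and facet them with NRQL instead of relying on the metric explorer.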

## Enable Kafka config events [#kafka-config]
## Kafka config events [#kafka-config]

The `kafka-clients-config` instrumentation module periodically sends events with the contents of your Kafka client configuration. This module is available since agent version 8.6.0 and is disabled by default.

@@ -140,7 +140,7 @@ class_transformer:
enabled: true
```
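
Put together, the `newrelic.yml` stanza to enable this module would look roughly like the sketch below. It assumes the agent's usual `class_transformer` module toggle and the fully qualified module name `com.newrelic.instrumentation.kafka-clients-config`; confirm the exact name against your agent's documentation or startup logs.

```yml
# Sketch: enables the kafka-clients-config instrumentation module.
# The fully qualified module name below is an assumption; confirm it
# against your agent's documentation or startup logs.
class_transformer:
  com.newrelic.instrumentation.kafka-clients-config:
    enabled: true
```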

## Enable Kafka Streams transactions [#collect-kafka-streams-transactions]
## Kafka Streams transactions [#collect-kafka-streams-transactions]

If you're using Kafka Streams, we don't enable transactions by default. This prevents unnecessary overhead, because Kafka applications tend to have high throughput.

@@ -168,7 +168,7 @@ class_transformer:
enabled: true
```
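
As a rough guide, opting in to Kafka Streams transactions follows the same module-toggle pattern. The module name `kafka-streams-spans` is an assumption (the real name may differ or carry a version suffix), so treat this as a sketch only.

```yml
# Sketch: opts in to Kafka Streams transaction reporting.
# Assumes the instrumentation module is named kafka-streams-spans;
# the actual name may differ or carry a version suffix.
class_transformer:
  com.newrelic.instrumentation.kafka-streams-spans:
    enabled: true
```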

## Enable Kafka distributed traces [#collect-kafka-distributed-traces]
## Kafka distributed traces [#collect-kafka-distributed-traces]

The Java agent can also collect [distributed traces](/docs/apm/distributed-tracing/getting-started/introduction-distributed-tracing) from Kafka clients. Because Kafka Streams runs on top of Kafka clients, the steps to manage distributed tracing also apply. Enabling traces doesn't affect the agent's normal operations; it will still report metric or event data from Kafka.
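
A minimal sketch of the settings involved follows, assuming distributed tracing is controlled by the standard `distributed_tracing.enabled` flag and that the Kafka client span instrumentation ships as a separately toggled `kafka-clients-spans` module (the module name is an assumption to verify).

```yml
# Sketch: turns on distributed tracing and the Kafka client span module.
# distributed_tracing.enabled is the standard agent flag; the
# kafka-clients-spans module name is an assumption to verify.
distributed_tracing:
  enabled: true
class_transformer:
  com.newrelic.instrumentation.kafka-clients-spans:
    enabled: true
```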
