Is your feature request related to a problem? Please describe.
We want to prevent duplicate logs when records are re-consumed from the Kafka backlog (e.g., when we need to "replay" logs from Kafka to recover from data loss caused by a Fluentd or Elasticsearch problem).
Our pipeline:
Describe the solution you'd like
The out_kafka plugin could generate a random ID and add it to each record under a configurable key. This key would later be used as the _id in Elasticsearch.
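A minimal sketch of what the requested option might look like. Note that `Record_Id_Key` is a hypothetical parameter name invented here to illustrate the request; it does not exist in out_kafka today:

```ini
# fluent-bit.conf -- illustrative only; Record_Id_Key is the proposed
# (hypothetical) out_kafka parameter, not an existing option.
[OUTPUT]
    Name           kafka
    Match          *
    Brokers        kafka:9092
    Topics         logs
    Record_Id_Key  log_id    # out_kafka would add a random ID under this key
```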
Describe alternatives you've considered
Currently we are using the Fluentd genhashvalue plugin, which calculates a hash for each log record, but this is not a very elegant solution to the problem. It would be great if Fluent Bit could add a random ID to the record itself; a possible interim workaround is sketched below.
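As a workaround that works today, Fluent Bit's lua filter can stamp each record with a pseudo-random ID before it is produced to Kafka, so a replay re-delivers the same ID. A minimal sketch, assuming the key name `log_id` and the script file `add_id.lua` (both names are just examples):

```ini
# fluent-bit.conf -- attach the ID before the record reaches out_kafka
[FILTER]
    Name    lua
    Match   *
    script  add_id.lua
    call    add_random_id
```

```lua
-- add_id.lua
-- Seed once so restarts do not replay the same pseudo-random sequence.
math.randomseed(os.time())

-- Fluent Bit Lua filter callback: returning code 1 marks the record as modified.
function add_random_id(tag, timestamp, record)
    -- 128 pseudo-random bits rendered as hex. Collisions are unlikely but
    -- possible; a content hash (what genhashvalue computes) is safer if
    -- strict uniqueness is required.
    record["log_id"] = string.format("%08x%08x%08x%08x",
        math.random(0, 0xffffffff), math.random(0, 0xffffffff),
        math.random(0, 0xffffffff), math.random(0, 0xffffffff))
    return 1, timestamp, record
end
```

Downstream, the consumer writing to Elasticsearch can map this key to the document id, e.g. via the `id_key` setting of fluent-plugin-elasticsearch in Fluentd (or `Id_Key` in Fluent Bit's es output when writing directly), so re-consumed messages overwrite rather than duplicate.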
Additional context
🙏