This Fluentd service consumes from Kafka and stores data in OpenSearch.
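
For context, here is a minimal sketch of the pipeline as I understand it (the plugin types are the usual fluent-plugin-kafka and fluent-plugin-opensearch ones; the broker, topic, and host values are placeholders, not the real configuration):

<source>
  @type kafka_group            # fluent-plugin-kafka consumer-group input
  brokers kafka-broker:9092    # placeholder broker address
  consumer_group fluentd       # placeholder consumer group name
  topics events                # placeholder topic
  format json
</source>

<match event-**>
  @type opensearch             # fluent-plugin-opensearch output
  host opensearch-host         # placeholder endpoint
  port 9200
</match>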

The longest logs are about 32,700 bytes, while typical logs are around 10 to 15 KB.

<buffer>
  chunk_limit_size 50m             # max size of a single chunk
  queue_limit_length 256           # max queued chunks (up to ~12.8 GB buffered)
  flush_mode immediate             # flush a chunk as soon as data is emitted
  flush_thread_count 7             # parallel flush threads
  retry_max_times 3                # give up a chunk after 3 failed retries
  retry_wait 10s                   # initial wait between retries
  overflow_action throw_exception  # raise BufferOverflowError when the buffer is full
</buffer>
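
As I read these settings, chunk_limit_size 50m with queue_limit_length 256 lets the buffer grow to roughly 50 MB x 256 = 12.8 GB before it overflows, and flush_mode immediate enqueues a chunk on every emit. A variant I am considering (the values are guesses, not a verified fix) caps the buffer by total size and pushes backpressure onto the Kafka input instead of raising:

<buffer>
  chunk_limit_size 8m        # smaller chunks flush sooner
  total_limit_size 512m      # cap overall buffer size directly
  flush_interval 5s          # batch flushes instead of flushing on every emit
  flush_thread_count 4
  retry_max_times 3
  retry_wait 10s
  overflow_action block      # pause the input instead of raising an exception
</buffer>

With the original settings, the worker logs the following: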
2024-04-29 19:02:02 +0000 [warn]: #0 failed to write data into buffer by buffer overflow action=:throw_exception
2024-04-29 19:02:02 +0000 [warn]: #0 suppressed same stacktrace
2024-04-29 19:02:02 +0000 [warn]: #0 emit transaction failed: error_class=Fluent::Plugin::Buffer::BufferOverflowError error="buffer space has too many data" location="/usr/local/bundle/gems/fluentd-1.14.6/lib/fluent/plugin/buffer.rb:327:in `write'" tag="event-***"

Logs with the "event-**" tag are duplicated in OpenSearch.

Please share any other solutions or suggestions.

Setting retry_forever true also results in duplicates in OpenSearch.
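
One idea I am evaluating for the duplicates (a sketch only; event_id stands for whatever field uniquely identifies a record in my data): fluent-plugin-opensearch can derive the document _id from a record field via id_key, so a retried chunk rewrites the same documents instead of indexing them a second time:

<match event-**>
  @type opensearch
  host opensearch-host      # placeholder endpoint
  port 9200
  id_key event_id           # assumed unique-per-record field used as the _id
  write_operation index     # same _id on retry replaces rather than duplicates
</match>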
