As per #323 (comment), it looks like the documentation does not have the right information and needs an update. That comment is also somewhat ambiguous, so kindly review this and explain which configurations have an impact and can be tuned to avoid the MessageSizeTooLarge exception.
Describe the bug
As per the README: max_send_limit_bytes - default: nil - Max byte size to send message to avoid MessageSizeTooLarge. For example, if you set 1000000 (message.max.bytes in kafka), messages larger than 1000000 bytes will be dropped.
However, even though I have set max_send_limit_bytes to 5 MB, requests of 20 MB and 16 MB are still being sent to Kafka.
Why is Fluentd not limiting the message size? Is there anything else that needs to be configured here?
Please advise as soon as possible.
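For reference, a minimal sketch of the kind of kafka2 output configuration being described, assuming the kafka2 output type; the tag pattern, broker address, and topic below are placeholders, not the reporter's actual values:

```
<match app.**>
  @type kafka2

  # Placeholder broker and topic; the reporter's real values are not shown here
  brokers kafka-broker:9092
  default_topic app-logs

  # Per the README, records larger than this limit should be dropped
  # before being sent (5 MB = 5242880 bytes)
  max_send_limit_bytes 5242880

  <format>
    @type json
  </format>

  <buffer topic>
    # Buffer chunk size also influences how much data is written out per flush
    chunk_limit_size 5m
  </buffer>
</match>
```

This is only an illustrative sketch; the actual configuration used when the issue was observed was not captured in this copy of the report.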
To Reproduce
configuration for the plugin:
Expected behavior
Fluentd should limit messages sent to Kafka to 5 MB, since max_send_limit_bytes is set to 5 MB.
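One possible explanation, stated here only as an assumption in line with the linked #323 comment rather than as confirmed behavior: if max_send_limit_bytes is applied to each individual record rather than to the whole produce request, then a buffered chunk of twenty records of 1 MB each would pass the 5 MB per-record check yet still be sent as a roughly 20 × 1 MB = 20 MB request, which would match the 20 MB and 16 MB requests reported above.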
Your Environment
Your Configuration
configuration for the plugin:
Your Error Log
Additional context
No response