Compressed size is not used for validating messages in the producer when compression is enabled
Versions
Sarama Version: v1.29.1, Go Version: 1.13
Configuration
What configuration values are you using for Sarama and Kafka?
Producer.Compression = sarama.CompressionGZIP
Producer.MaxMessageBytes = 2097164
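For reference, a minimal sketch of how this configuration is applied (the broker address and the surrounding program are placeholders, not our actual code):

```go
package main

import (
	"log"

	"github.com/Shopify/sarama"
)

func main() {
	config := sarama.NewConfig()
	// Compress record batches with gzip before sending.
	config.Producer.Compression = sarama.CompressionGZIP
	// Client-side cap on message size, in bytes (matches the value above).
	config.Producer.MaxMessageBytes = 2097164

	// "localhost:9092" is a placeholder broker address.
	producer, err := sarama.NewAsyncProducer([]string{"localhost:9092"}, config)
	if err != nil {
		log.Fatalf("failed to create producer: %v", err)
	}
	defer producer.Close()
}
```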
Logs
kafka: Failed to produce message to topic: kafka server: Message was too large, server rejected it to avoid allocation error.
Problem Description
Even after setting the producer compression to sarama.CompressionGZIP, we are noticing that in async_producer.go the message byte size validation checks against the uncompressed message, even though the Kafka broker accepts compressed messages based on its max-size settings. This causes problems when sending messages whose compressed size is far smaller than their uncompressed size. Can we get some help with this?
https://github.com/Shopify/sarama/blob/6693712f54b76066ea239255c30585832983947d/async_producer.go#L367
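To illustrate the mismatch, here is a small self-contained sketch (plain Go, not Sarama code) showing a payload whose raw size exceeds the configured limit while its gzip-compressed size is far below it:

```go
package main

import (
	"bytes"
	"compress/gzip"
	"fmt"
	"strings"
)

func main() {
	// A highly compressible 3 MB payload: the raw size exceeds the
	// 2097164-byte limit above, but the gzip-compressed size is tiny.
	payload := []byte(strings.Repeat("a", 3*1024*1024))

	var buf bytes.Buffer
	zw := gzip.NewWriter(&buf)
	zw.Write(payload)
	zw.Close()

	fmt.Printf("uncompressed: %d bytes, gzip-compressed: %d bytes\n",
		len(payload), buf.Len())
	// Sarama's check compares roughly len(payload) plus per-record overhead
	// against Producer.MaxMessageBytes, so a message like this is rejected
	// client-side even though the compressed batch would easily fit.
}
```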
Any updates on this issue? Thanks
@parikdepa the issue is correct; unfortunately Sarama does a local comparison of the pre-compression bytes, and it's probably non-trivial to fix at the moment.
However, this was always (as far as I know) a convenient client-side safety net, and the remote Kafka cluster will reject the produce request if you send a compressed message that is larger than the configured maximum size it can accept. You could just set Producer.MaxMessageBytes = sarama.MaxRequestSize and you're unlikely to see any ill effects.
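A minimal sketch of that workaround, assuming the int32-to-int conversion is needed because sarama.MaxRequestSize is declared as an int32 variable (the package and function names here are hypothetical):

```go
package example

import "github.com/Shopify/sarama"

func newPermissiveConfig() *sarama.Config {
	config := sarama.NewConfig()
	config.Producer.Compression = sarama.CompressionGZIP
	// Raise the client-side limit to the maximum request size Sarama will
	// send, effectively deferring size enforcement to the broker's
	// message.max.bytes setting.
	config.Producer.MaxMessageBytes = int(sarama.MaxRequestSize)
	return config
}
```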
Thank you for taking the time to raise this issue. However, it has not had any activity on it in the past 90 days and will be closed in 30 days if no updates occur. Please check if the main branch has already resolved the issue since it was raised. If you believe the issue is still valid and you would like input from the maintainers then please comment to ask for it to be reviewed.