
set buffer memory to control max memory usage when produce lots of topic

mikepop7 opened this issue 6 months ago · 1 comment

Description

Is there any config option to set a maximum memory usage across all topics on the producer? Something like the Java Kafka SDK's: props.put("buffer.memory", 33554432); // 32MB

Versions
Sarama: v1.45.1
Kafka: 3.6.2
Go: 1.23.0
Configuration
cf := sarama.NewConfig()
cf.Producer.Return.Successes = true
cf.Producer.Return.Errors = true
cf.Producer.Compression = sarama.CompressionLZ4
cf.Producer.Flush.Bytes = 1048576
cf.Producer.Flush.Messages = 10000
cf.Producer.Flush.Frequency = 10 * time.Second
cf.Producer.Flush.MaxMessages = 12000
cf.Producer.MaxMessageBytes = 10485760

Additional Context

mikepop7 avatar Jun 05 '25 09:06 mikepop7

implemented by: https://github.com/IBM/sarama/pull/3088/
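For reference, a minimal sketch of how the option added by that PR can be set, assuming sarama v1.45+ (the 32 MB value just mirrors the Java buffer.memory example above):

```go
package main

import "github.com/IBM/sarama"

func main() {
	cf := sarama.NewConfig()
	// Bound the bytes buffered for retries on the producer.
	cf.Producer.Retry.MaxBufferBytes = 32 * 1024 * 1024 // 32MB
	_ = cf
}
```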

JunliWang avatar Jun 06 '25 22:06 JunliWang

> implemented by: #3088

Thanks for the reply. The conf.Producer.Retry.MaxBufferBytes setting only bounds the messages buffered for retry in retryHandler(), so it only kicks in when a produce to a single broker errors (e.g. when exceeding MaxRequestSize/MaxMessageBytes/Flush.MaxMessages). What I need is to cap the client's total memory usage regardless of how many brokers (e.g. 30 brokers) or topics (e.g. 100 topics) I produce to, so this feature doesn't cover my scenario.

mikepop7 avatar Jun 25 '25 08:06 mikepop7

Thank you for taking the time to raise this issue. However, it has not had any activity on it in the past 90 days and will be closed in 30 days if no updates occur. Please check if the main branch has already resolved the issue since it was raised. If you believe the issue is still valid and you would like input from the maintainers then please comment to ask for it to be reviewed.

github-actions[bot] avatar Sep 23 '25 10:09 github-actions[bot]