Tudor Marghidanu
I think I got it fixed by adding `api.version.request` and `broker.version.fallback` to the Consumer arguments.

```
args = {
    'bootstrap.servers': self.brokers,
    'group.id': self.group_id,
    'enable.auto.commit': False,
    'debug': 'protocol,msg,fetch,broker',
    'api.version.request': False,
    'broker.version.fallback': ...
```
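For context, here's roughly what that looks like as a complete consumer. This is a minimal sketch, not our actual code: the broker list, group id, topic, and the `'0.9.0'` fallback version are placeholder assumptions.

```python
from confluent_kafka import Consumer

args = {
    'bootstrap.servers': 'localhost:9092',  # placeholder broker list
    'group.id': 'example-group',            # placeholder group id
    'enable.auto.commit': False,
    # Skip the broker ApiVersionRequest and pin the protocol to an
    # older broker version instead (the workaround described above).
    'api.version.request': False,
    'broker.version.fallback': '0.9.0',     # assumed fallback version
}

consumer = Consumer(args)
consumer.subscribe(['example-topic'])       # placeholder topic

msg = consumer.poll(timeout=1.0)
if msg is not None and msg.error() is None:
    print(msg.value())

consumer.close()
```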
Well, it actually stopped crashing, but there's a more visible error now: `KafkaError{code=_BAD_MSG,val=-199,str="Local: Bad message format"}`
So far only these two: 1.1.0 and 1.2.0. We are reading messages in batches, 1,000 at a time. Could this be related? Maybe the buffer is not large...
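In case it helps to reproduce: here's a minimal sketch of the batch read with librdkafka's fetch-buffer settings (`fetch.message.max.bytes`, `receive.message.max.bytes`) made explicit. The broker, group, topic, and byte values are illustrative assumptions, not our actual settings.

```python
from confluent_kafka import Consumer

consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',   # placeholder
    'group.id': 'example-group',             # placeholder
    'enable.auto.commit': False,
    # librdkafka fetch-buffer settings; illustrative values only.
    'fetch.message.max.bytes': 1048576,      # per topic+partition fetch limit
    'receive.message.max.bytes': 100000000,  # total protocol response limit
})
consumer.subscribe(['example-topic'])        # placeholder topic

# Read up to 1000 messages per call, as in the batches described above.
batch = consumer.consume(num_messages=1000, timeout=5.0)
for msg in batch:
    if msg.error() is None:
        print(msg.value())

consumer.close()
```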
We're using the librdkafka package from Alpine and the messages are not compressed.
Use separate development environments, something like local::lib. That way you can keep multiple installations against different Python versions.
I've also bumped into this issue while bulk-exporting the blocks into another format.
This does not work on macOS at all. First the slice error, then the backupable error.
I tried this option in 5.3.1 and it doesn't seem to work. Is there a way to specify the schema subject name when creating a stream?