Volodymyr Pavlenko
Seems that disabling the `com.spotify.heroic.end_bucket_stategy` feature fixes this. It can be disabled at query time by including `-com.spotify.heroic.end_bucket_stategy` as an additional feature. The documentation says: ``` Enabled by default. Use the legacy bucket...
Hi Mark, thanks for your reply! 1. Is there a way to know whether the connection has already been established? It seemed to us that this is quite encapsulated inside the Lettuce code,...
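Something like the sketch below is what we would want to do before dispatching a batch; `StatefulConnection#isOpen()` is the closest hook we could find, but it is not obvious whether that is the intended way (the URI and key here are placeholders, not our real setup):

```java
import io.lettuce.core.RedisClient;
import io.lettuce.core.RedisFuture;
import io.lettuce.core.api.StatefulRedisConnection;

public class ConnectionCheckSketch {
    public static void main(String[] args) {
        // Placeholder endpoint; in reality we connect to a cluster.
        RedisClient client = RedisClient.create("redis://localhost:6379");
        StatefulRedisConnection<String, String> connection = client.connect();

        // The check we are after: only dispatch the batch once the
        // underlying channel is actually established.
        if (connection.isOpen()) {
            RedisFuture<String> value = connection.async().get("some-key");
            // ... issue the rest of the batch ...
        }

        connection.close();
        client.shutdown();
    }
}
```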
I am not sure this will actually work in our approach, because: 1) Even if the connection is open before sending the batch, the race condition still exists:...
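To make that race concrete, here is a rough sketch of the check-then-send pattern we mean, using manual pipelining via `setAutoFlushCommands`/`flushCommands` (names and timeout are placeholders): the connection can drop between the check and the flush, so the check alone does not close the window.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;

import io.lettuce.core.LettuceFutures;
import io.lettuce.core.RedisFuture;
import io.lettuce.core.api.StatefulRedisConnection;

public class BatchRaceSketch {

    static void sendBatch(StatefulRedisConnection<String, String> connection,
                          List<String> keys) {
        if (connection.isOpen()) {                        // (1) check passes
            // <-- the connection can be closed or reset right here -->
            connection.setAutoFlushCommands(false);       // queue commands locally
            List<RedisFuture<String>> futures = new ArrayList<>();
            for (String key : keys) {
                futures.add(connection.async().get(key)); // (2) commands are queued
            }
            connection.flushCommands();                   // (3) batch is written to a
                                                          //     connection that may be gone
            LettuceFutures.awaitAll(1, TimeUnit.SECONDS,
                    futures.toArray(new RedisFuture[0]));
        }
    }
}
```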
In this case, it would be great to update the documentation, since it took us a lot of time to find the cause of the race condition. If pipelining should not...
Just to give you more context: we have a system with very high throughput that is also latency-critical. We need to fetch batches of randomly spread keys (total...
An MGET would be split by slot, and splitting multigets of 10-100 keys across 16384 slots simply turns them into single GETs. We tried this before, and performance was very poor....
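To put a number on it, here is a small sketch that groups a 50-key batch by slot, which is roughly how a cluster-aware MGET has to be split. It assumes Lettuce's `SlotHash` helper (the CRC16 slot routine it uses internally) is on the classpath; the key names are made up.

```java
import java.util.List;
import java.util.Map;
import java.util.UUID;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

import io.lettuce.core.cluster.SlotHash;

public class SlotFanOutSketch {
    public static void main(String[] args) {
        // 50 randomly spread keys, standing in for one of our multigets.
        List<String> keys = IntStream.range(0, 50)
                .mapToObj(i -> "user:" + UUID.randomUUID())
                .collect(Collectors.toList());

        // Group the batch by hash slot, the same way an MGET must be split in cluster mode.
        Map<Integer, List<String>> bySlot = keys.stream()
                .collect(Collectors.groupingBy(SlotHash::getSlot));

        // With 16384 slots, almost every key lands in its own group,
        // so the 50-key MGET effectively becomes ~50 single-key GETs.
        System.out.println("keys: " + keys.size() + ", distinct slots: " + bySlot.size());
    }
}
```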