Oleksii Diagiliev
Hi @borislitvak, it's 1.75GB of data in Redis, not 10MB. Also, please note that the data is written to Redis at row granularity and is accessible (can be queried)...
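For example, here is a minimal sketch of querying the written rows back as a DataFrame (the "person" table name, "id" key column and connection settings are just placeholders, not taken from your setup):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .config("spark.redis.host", "localhost") // assumed connection settings
  .config("spark.redis.port", "6379")
  .getOrCreate()

// Each Redis hash written by the connector maps back to a DataFrame row
val df = spark.read
  .format("org.apache.spark.sql.redis")
  .option("table", "person")    // placeholder table name
  .option("key.column", "id")   // placeholder key column
  .load()

df.show() // the rows can then be queried with regular DataFrame operations
```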
I would prefer to postpone these changes until we upgrade to Scala 2.12
Hi @iyer-r, I didn't fully understand the idea of continuing loading while some node is down. Could you please clarify how you think it should work under the hood?...
I have never tried it with Sentinel and I'm not sure whether the library was designed to support it.
Hi @leobenkel, `fromRedisKeyPattern()` uses `SCAN` internally. How many keys do you have in total and how many match your pattern? Does it work in general with a smaller number...
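For reference, a minimal sketch of how the RDD API can be exercised on a smaller key set (the "person:*" pattern, partition count and connection settings below are just examples):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import com.redislabs.provider.redis._ // adds fromRedisKeyPattern to SparkContext

val conf = new SparkConf()
  .setAppName("scan-test")
  .set("spark.redis.host", "localhost") // example connection settings
  .set("spark.redis.port", "6379")
val sc = new SparkContext(conf)

// SCANs keys matching the pattern; the second argument is the number of partitions
val keysRDD = sc.fromRedisKeyPattern("person:*", 5)
println(s"matched keys: ${keysRDD.count()}")
```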
Hi, do you mean that you change the "key.column" option or the "table" option? Do you specify a SaveMode? Ideally, please share your code.
Could you please try `.mode('append')`?
With the default `SaveMode` (`ErrorIfExists`), Spark checks whether the dataframe already exists and returns an error in that case. With `.mode('append')` it skips that check and just writes the data...
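In Scala it would look roughly like this (the "person" table, "id" key column and the sample data are placeholders, adapt them to your schema):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder()
  .config("spark.redis.host", "localhost") // assumed connection settings
  .config("spark.redis.port", "6379")
  .getOrCreate()
import spark.implicits._

val df = Seq((1, "Alice"), (2, "Bob")).toDF("id", "name") // placeholder data

df.write
  .format("org.apache.spark.sql.redis")
  .option("table", "person")    // placeholder table name
  .option("key.column", "id")   // placeholder key column
  .mode(SaveMode.Append)        // skip the existence check and just write
  .save()
```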
It looks like this data type is currently not supported. I will investigate the details later.
@iyer-r, I will take a look at whether we can support byte arrays. Why doesn't UTF-8 work for you? I'm not sure I understood the question about converting a byte array...