Oleksii Diagiliev
The write function uses as many connections as there are partitions in the RDD/DataFrame, bounded by the maximum number of connections in the connection pool. Each partition obtains a connection from the...
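For illustration (not part of the original comment), here is a minimal sketch of how the partition count bounds the number of connections used during a write, assuming the standard spark-redis DataFrame write path; the host, port, and table name are placeholders:

```scala
import org.apache.spark.sql.SparkSession

// Placeholder connection settings
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.redis.host", "localhost")
  .config("spark.redis.port", "6379")
  .getOrCreate()

val df = spark.range(0, 1000).toDF("id")

// With 4 partitions, the write uses at most 4 connections,
// capped by the connection pool's maximum size.
df.repartition(4)
  .write
  .format("org.apache.spark.sql.redis")
  .option("table", "example") // hypothetical table name
  .mode("overwrite")
  .save()
```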
Hi @shadibch, could you please share the sample dataframe (after the transformation) that you are trying to save?
Could you please provide an example I can reproduce?
Hi @gwwallace, there is no such option.
The master branch (ver. 2.3.1-SNAPSHOT) currently uses a jedis fork:

```xml
<dependency>
    <groupId>com.redislabs</groupId>
    <artifactId>jedis</artifactId>
    <version>3.0.0-20181113.105826-9</version>
</dependency>
```

@gkorland, could you please advise which Redis version it is compatible with?
Hi @patrickmao93, I was not able to reproduce it. My code:

```scala
val spark = SparkSession
  .builder()
  .master("local[*]")
  .appName("ExtractRawTablesFromMySQL")
  .config("spark.redis.host", "my-redis.url.com")
  .config("spark.redis.port", "6379")
  .getOrCreate()

val sc = spark.sparkContext
val...
```
Please also provide the code showing how the `sc` value is initialized.