
How to implement toRedisHASH(keyMaps: RDD[(String, Map[String, String])], ttl: Int = 0)

zhou533 opened this issue 8 years ago · 5 comments

The current implementation of toRedisHASH seems to write one key at a time. Now I want to write an RDD[(String, Map[String, String])]. Is that possible, or do you have any suggestions?
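Pending library support, one workaround is to write the hashes yourself with foreachPartition and a Jedis pipeline. This is a sketch, not part of the spark-redis API; the helper name `toRedisHashes` and the host/port parameters are made up for illustration:

```scala
import org.apache.spark.rdd.RDD
import redis.clients.jedis.Jedis

// Hypothetical helper (not in spark-redis): writes an
// RDD[(String, Map[String, String])] where each tuple is
// (hash key, field -> value map), pipelining one HSET per field.
def toRedisHashes(keyMaps: RDD[(String, Map[String, String])],
                  host: String = "localhost", port: Int = 6379): Unit = {
  keyMaps.foreachPartition { iter =>
    val jedis = new Jedis(host, port)
    val ppl = jedis.pipelined()
    iter.foreach { case (hashKey, fields) =>
      fields.foreach { case (field, value) =>
        ppl.hset(hashKey, field, value) // queued, not sent yet
      }
    }
    ppl.sync() // flush all queued commands in one round trip per batch
    jedis.close()
  }
}
```

Opening one connection per partition keeps the non-serializable Jedis client out of the closure that Spark ships to executors.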

zhou533 avatar Mar 11 '17 10:03 zhou533

This is interesting as well. Why isn't this the RDD type returned from fromRedisHash? You lose the key context with the current API: you cannot tell which hash belongs to which key without embedding the keys themselves in the hash.

Or am I missing something fundamental?

mnarrell avatar Apr 17 '17 21:04 mnarrell

The current way to save multiple maps to Redis:

    val wc = sc.parallelize(List(
      ("hello", "1")
    ))
    sc.toRedisHASH(wc, "map1")
    sc.toRedisHASH(wc, "map2")
    sc.toRedisHASH(wc, "map3")

What if I have an RDD like:

    val wcBatch = sc.parallelize(List(
      ("map1", "hello", "1"),
      ("map2", "hello", "1"),
      ("map3", "hello", "1")
    ))

How can I batch-save multiple maps into Redis with a single method call?

This batch style would also be useful for lists, zsets, and so on.

PS: If I use the plain Jedis API, I can do this job with a pipeline:

    import redis.clients.jedis.Jedis

    val rdd = sc.parallelize(List(
      ("map1", "key1", "value1"),
      ("map1", "key2", "value2"),
      ("map1", "key3", "value3"),
      ("map2", "key1", "value1")
    ))
    rdd.foreachPartition { iter =>
      val redis = new Jedis("localhost", 6379, 400000)
      val ppl = redis.pipelined()
      iter.foreach { case (mapKey, key, value) =>
        // hmset expects a java.util.Map, so for a single field use hset
        ppl.hset(mapKey, key, value)
      }
      ppl.sync()
      redis.close()
    }
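If the rows for one hash key are scattered across a partition, it can help to pre-group the (hashKey, field, value) triples into one field map per key before writing. The grouping step is plain Scala and independent of Spark and Jedis; `groupByHash` is a name chosen here for illustration:

```scala
// Group flat (hashKey, field, value) rows into one field map per hash key.
// Pure Scala collections, so the same logic works inside mapPartitions.
def groupByHash(rows: Seq[(String, String, String)]): Map[String, Map[String, String]] =
  rows.groupBy(_._1).map { case (hashKey, rs) =>
    hashKey -> rs.map { case (_, field, value) => field -> value }.toMap
  }
```

Each resulting `Map[String, String]` can then be written with one pipelined command per hash instead of one per row.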

zqhxuyuan avatar Jun 14 '17 07:06 zqhxuyuan

+1 on this issue.

simalince avatar Oct 18 '19 18:10 simalince

@gkorland @fe2s Is there any progress?

charsyam avatar Dec 13 '19 00:12 charsyam

@charsyam , sorry, no progress on this yet

fe2s avatar Dec 15 '19 17:12 fe2s