Redis on Spark: variable pipeline in class BinaryJedis cannot be accessed

Posted: 2015-12-12 21:12:35

Tags: scala apache-spark redis spark-streaming

I am trying to write to Redis from Spark, but I get a compile-time error saying "variable pipeline in class BinaryJedis cannot be accessed in redis.clients.jedis.Jedis". My code is as follows (shown in part):

    import org.sedis._
    import redis.clients.jedis._
    ...
    val myRDD = KafkaUtils.createStream(ssc, zkQuorum, group, topic).map(_._2).window(Seconds(300), Seconds(10))
    myRDD.foreachRDD( rdd => {rdd.foreachPartition(it =>{
      val pool = new Pool(new JedisPool(new JedisPoolConfig(), "localhost", 6379, 2000))
      pool.withJedisClient { client => 
        val pipeline = client.pipeline()
        it.foreach {
          case (a,b,c) => pipeline.hmset(a,Map("b" -> b, "c" -> c))
        }
      }

    })})

The error I get is the following:

    error: variable pipeline in class BinaryJedis cannot be accessed in redis.clients.jedis.Jedis
    Access to protected variable pipeline not permitted because enclosing object MainExample in package examples is not a subclass of class BinaryJedis in package jedis where target is defined 
    val pipeline = client.pipeline()

I have searched for a solution but could not find one. Can someone help me? Thanks in advance.

1 Answer:

Answer 0 (score: 0):

I solved the problem above by dropping the partition-level loop and replacing the Jedis pool with a new Jedis instance for each data tuple written to Redis, as shown below. This works for me. Note that the working code obtains the pipeline via jedis.pipelined, the public Jedis method, rather than client.pipeline(), which the compiler resolves to the protected pipeline field of BinaryJedis and therefore rejects.

    import org.sedis._
    import redis.clients.jedis._
    import scala.collection.JavaConverters._   // for .asJava, since Jedis expects a java.util.Map
    ...
    val myRDD = KafkaUtils.createStream(ssc, zkQuorum, group, topic)
                          .map(_._2)
                          .window(Seconds(300), Seconds(10))

    myRDD.foreachRDD { rdd =>
      rdd.foreach { case (a, b, c) =>
        // A dedicated connection per tuple instead of a shared pool.
        val jedis = new Jedis("localhost", 6379)
        val pipeline = jedis.pipelined()   // public method, unlike the protected pipeline field
        pipeline.hmset(a, Map("b" -> b, "c" -> c).asJava)
        pipeline.sync()                    // flush the queued command and read the reply
        jedis.close()
      }
    }
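
For completeness, the original per-partition approach should also work once the pipeline is obtained through pipelined() rather than the protected field. The sketch below is an untested variant of the question's code (the host/port and the (a, b, c) tuple shape are carried over as assumptions) that reuses one connection and one pipeline per partition instead of opening a connection per record:

    import redis.clients.jedis.Jedis
    import scala.collection.JavaConverters._

    myRDD.foreachRDD { rdd =>
      rdd.foreachPartition { it =>
        // One connection and one pipeline per partition, shared by all tuples in it.
        val jedis = new Jedis("localhost", 6379)
        val pipeline = jedis.pipelined()
        it.foreach { case (a, b, c) =>
          pipeline.hmset(a, Map("b" -> b, "c" -> c).asJava)
        }
        pipeline.sync()   // send the queued commands and read the replies
        jedis.close()
      }
    }

Keeping the connection at partition level bounds the number of Redis connections by the number of partitions rather than by the number of records.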