Iterating over an RDD and looking up another RDD by key based on a value

Time: 2014-11-10 16:02:16

Tags: scala apache-spark

I actually have two RDDs with the same structure: RDD[(String, (Int, scala.collection.immutable.Map[String,Int], Double))]

rdd1

(A,(1,Map(VVV -> 1),0.0))
(B,(26,Map(DDD -> 2, PPP -> 7, OOO -> 2, EEE -> 3, LLL -> 12),1.35))
(C,(2,Map(VVV -> 2),0.0))

rdd2

(OOO,(2,Map(B -> 2),0.0))
(DDD,(2,Map(B -> 2),0.0))
(PPP,(7,Map(B -> 7),0.0))
(LLL,(12,Map(B -> 12),0.0))
(VVV,(3,Map(C -> 2, A -> 1),0.63))
(EEE,(3,Map(B -> 3),0.0))

I need to iterate over rdd1 and, for each of the inner map keys ((VVV), (DDD, PPP, OOO, EEE, LLL), (VVV)), look that key up in rdd2, then call a function to perform a calculation.

What is the way to do this? Is it possible at all to iterate over one RDD and, based on a value, look up another RDD by key?

I tried the following:

def calculate(t: String, c: Int, m: scala.collection.immutable.Map[String,Int], e: Double, r: org.apache.spark.rdd.RDD[(String, (Int, scala.collection.immutable.Map[String,Int], Double))]) = {    
    Tuple5(t,c,m,e,r.lookup("DDD"))
}
val newRDD = rdd1.map(f => calculate(f._1, f._2._1, f._2._2, f._2._3, rdd2))

When I execute newRDD.take(10).foreach(println(_))

it gives me the following error message:

14/11/10 13:30:46 ERROR Executor: Exception in task ID 54 scala.MatchError: null 
    at org.apache.spark.rdd.PairRDDFunctions.lookup(PairRDDFunctions.scala:572)

Another test was:

rdd1.foreach(a => { rdd2.foreach(b => { println(b)}) })

But it gives me the following error message:

14/11/10 13:35:23 ERROR Executor: Exception in task ID 55 java.lang.NullPointerException
    at org.apache.spark.rdd.RDD.foreach(RDD.scala:715)

1 Answer:

Answer 0 (score: 2)

You can't reference one RDD from inside a transformation of another RDD: the closures are serialized to the executors, where the driver-side RDD reference is null, which is exactly why both your lookup call and the nested foreach fail. Instead, I would convert your maps into tuples (giving an RDD with one entry for each map entry in the original rdd1) and then join:

val splitRdd1: RDD[(String, (String, Int, Int, Double))] =
  rdd1.flatMap { case (s, (i, map, d)) =>
    map.toList.map { case (k, v) => (k, (s, i, v, d)) }
  }
val newRdd = splitRdd1.join(rdd2).map { ... }
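As a runnable sketch of the same flatMap-then-join shape, here is the logic on plain Scala collections: the data below is a small subset of the question's rdd1/rdd2, a List stands in for each RDD, and a Map lookup stands in for RDD.join, but the record shapes are identical. The names (splitRdd1, rdd2ByKey, joined) and the chosen data rows are illustrative assumptions.

```scala
// Subset of the question's data; List stands in for RDD here.
val rdd1 = List(
  ("A", (1, Map("VVV" -> 1), 0.0)),
  ("C", (2, Map("VVV" -> 2), 0.0))
)
val rdd2 = List(
  ("VVV", (3, Map("C" -> 2, "A" -> 1), 0.63))
)

// Step 1: explode each inner-map entry of rdd1 into its own keyed record,
// so the map key ("VVV") becomes the join key.
val splitRdd1: List[(String, (String, Int, Int, Double))] =
  rdd1.flatMap { case (s, (i, map, d)) =>
    map.toList.map { case (k, v) => (k, (s, i, v, d)) }
  }

// Step 2: join on that key. With real RDDs this is splitRdd1.join(rdd2);
// here a Map lookup plays the same role.
val rdd2ByKey = rdd2.toMap
val joined = splitRdd1.flatMap { case (k, left) =>
  rdd2ByKey.get(k).map(right => (k, (left, right)))
}

joined.foreach(println)
```

After the join, each element pairs one exploded rdd1 record with the matching rdd2 record under the shared key, which is exactly the shape the final `.map { ... }` would receive to do the calculation.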