How to create a new RDD without nesting transformations

Asked: 2018-09-18 02:05:54

Tags: scala csv apache-spark rdd

I want to create an RDD whose records have the following format:

(trip, (start station details), (end station details))

import org.apache.spark._

val input1 = sc.textFile("data/trips/*")
val header1 = input1.first // to skip the header row
val trips = input1.filter(_ != header1).map(_.split(","))

val input2 = sc.textFile("data/stations/*")
val header2 = input2.first // to skip the header row
val stations = input2.filter(_!=header2).map(_.split(",")).keyBy(_(0).toInt)

def pjoined (joined: (Array[String], Array[String], Array[String])) = {
    println(""+joined._1.deep.mkString(",")+"; "+joined._2.deep.mkString(",")+"; "+joined._3.deep.mkString(","))
}

val joinedtrips = trips.map(tup => (tup, (stations.filter(_._1==tup(4).toInt).first._2), (stations.filter(_._1==tup(7).toInt).first._2)))
joinedtrips.take(5).foreach(pjoined)

The second-to-last line fails with the following error:

org.apache.spark.SparkException: RDD transformations and actions can only be invoked by the driver, not inside of other transformations; for example, rdd1.map(x => rdd2.values.count() * x) is invalid because the values transformation and count action cannot be performed inside of the rdd1.map transformation.

What is a correct and efficient way to achieve this?

stations.csv:

station_id,name,lat,long,dockcount,landmark,installation,notes
2,San Jose Diridon Caltrain Station,37.329732,-121.901782,27,San Jose,8/6/2013,
3,San Jose Civic Center,37.330698,-121.888979,15,San Jose,8/5/2013,
...

trips.csv:

Trip ID,Duration,Start Date,Start Station,Start Terminal,End Date,End Station,End Terminal,Bike #,Subscription Type,Zip Code
4258,114,8/29/2013 11:33,San Jose City Hall,10,8/29/2013 11:35,MLK Library,11,107,Subscriber,95060
4265,151,8/29/2013 11:40,San Francisco City Hall,58,8/29/2013 11:42,San Francisco City Hall,58,520,Subscriber,94110
...
The station_id in stations.csv is meant to match the Start Terminal (index 4) and End Terminal (index 7) columns in trips.csv.

1 Answer:

Answer 0 (score: 0)

Two approaches below. Also, see Shaido's comment about using a DataFrame instead.

Approach 1: collect the (small) stations RDD into a map and broadcast it to the executors, then look up both terminals inside the map:

val bcStations = sc.broadcast(stations.collectAsMap) // Map[Int, Array[String]]

val joined = trips.map(trip => {
    // getOrElse falls back to Nil when a terminal id has no matching station
    (trip, bcStations.value.getOrElse(trip(4).toInt, Nil), bcStations.value.getOrElse(trip(7).toInt, Nil))
})

println(joined.toDebugString)

joined.take(1)

Approach 2: collect the map on the driver and let Spark capture it in the closure (no explicit broadcast; fine for small lookup tables, as the map is shipped with each task):

val mapStations = stations.collectAsMap

val joinedtrips = trips.map(trip => {
    (trip, mapStations.getOrElse(trip(4).toInt, Nil), mapStations.getOrElse(trip(7).toInt, Nil))
})

joinedtrips.take(1)
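
A third option, not in the original answer, is to avoid collecting stations at all and use the join transformation on keyed pair RDDs. This is a sketch assuming the same column indices as the question (4 = Start Terminal, 7 = End Terminal) and the trips and stations RDDs defined there; it shuffles data, so the broadcast approach is usually faster when stations is small, but join scales when both sides are large:

```scala
// Key each trip by its start terminal, join against stations,
// then re-key by end terminal and join again.
val byStart = trips.keyBy(_(4).toInt)     // (startId, trip)
val withStart = byStart.join(stations)    // (startId, (trip, startStation))

val withBoth = withStart
  .map { case (_, (trip, start)) => (trip(7).toInt, (trip, start)) }
  .join(stations)                         // (endId, ((trip, start), endStation))
  .map { case (_, ((trip, start), end)) => (trip, start, end) }

withBoth.take(5).foreach(pjoined)
```

Note that join drops trips whose terminal id has no matching station; use leftOuterJoin to keep them.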