RDD of BSONObject to DataFrame

Time: 2016-10-04 11:46:22

Tags: json mongodb apache-spark bson

I am loading Mongo's bson dump into Spark as described here. It works, but what I get is:

org.apache.spark.rdd.RDD[(Object, org.bson.BSONObject)]

It should basically just be JSON where all the fields are Strings. The rest of my code needs a DataFrame object to manipulate the data, but of course toDF fails on that RDD. How can I convert it to a Spark DataFrame with all fields as String? Something like spark.read.json would be great.

2 Answers:

Answer 0 (score: 0)

val datapath = "path_to_bson_file.bson"

import org.apache.hadoop.conf.Configuration

// Set up the configuration for reading from the bson dump.
val bsonConfig = new Configuration()
bsonConfig.set("mongo.job.input.format", "com.mongodb.hadoop.BSONFileInputFormat")

// given your Spark session
implicit lazy val sparkSession = initSpark()

// read the RDD[(Object, org.bson.BSONObject)] and map each BSON document to a JSON string
val bson_data_as_json_string = sparkSession.sparkContext.newAPIHadoopFile(
    datapath,
    classOf[com.mongodb.hadoop.BSONFileInputFormat].asSubclass(
      classOf[org.apache.hadoop.mapreduce.lib.input.FileInputFormat[Object, org.bson.BSONObject]]),
    classOf[Object],
    classOf[org.bson.BSONObject],
    bsonConfig)
  .map { row => com.mongodb.util.JSON.serialize(row._2) }

// read into a Spark JSON Dataset:
val bson_data_as_json_dataset = sparkSession.sqlContext.read.json(bson_data_as_json_string)
// inspect the inferred schema:
bson_data_as_json_dataset.printSchema()
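The key step in this approach is mapping each (key, BSONObject) pair to just the JSON string of its value before handing the collection to spark.read.json. A minimal stand-alone sketch of that shape, with the Mongo serialization stubbed out as literal JSON strings (illustrative only; no Spark or Mongo dependency, and the field names are made up):

```scala
// Stand-in for the RDD[(Object, org.bson.BSONObject)] pairs: the Hadoop
// key is ignored, and each value is shown here already serialized to the
// kind of JSON string com.mongodb.util.JSON.serialize would produce.
val pairs: Seq[(String, String)] = Seq(
  ("key1", """{"name": "alice", "age": 30}"""),
  ("key2", """{"name": "bob", "age": 25}""")
)

// Mirrors the .map(row => serialize(row._2)) step above: keep only the
// serialized document, dropping the key half of each pair.
val jsonStrings: Seq[String] = pairs.map(_._2)

jsonStrings.foreach(println)
```

The resulting collection of JSON strings is exactly what spark.read.json expects as input.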

Answer 1 (score: 0)

Try the following code:

import com.mongodb.hadoop.BSONFileInputFormat
import org.bson.BSONObject

// Parse each raw document and re-serialize it as a JSON string.
def parseData(s: String): String = {
  val doc = org.bson.Document.parse(s)
  com.mongodb.util.JSON.serialize(doc)
}

val df = spark.read.json(
  spark.sparkContext.newAPIHadoopFile(
      "src/main/resources/MyDummyData",
      classOf[BSONFileInputFormat].asSubclass(
        classOf[org.apache.hadoop.mapreduce.lib.input.FileInputFormat[Object, BSONObject]]),
      classOf[Object],
      classOf[BSONObject])
    .map(_._2)
    .map(x => parseData(x.toString)))