Reading complex JSON in Spark SQL using Java

Date: 2016-04-06 09:13:08

Tags: java spark-streaming

My JSON file looks like the one below, and I am trying to read all the Names under majorsector_percent with the following code.

Code:

    JavaSQLContext sQLContext = new JavaSQLContext(sc);
    sQLContext.jsonFile("C:/Users/HimanshuK/Downloads/world_bank/world_bank.json").registerTempTable("logs");
    sQLContext.sqlContext().cacheTable("logs");
    List s = sQLContext.sql("select majorsector_percent from logs limit 1 ")
            .map(row -> new Tuple2<>(row.getString(0), row.getString(1)))
            .collect();


JSON file:

     { "_id" : { "$oid" : "52b213b38594d8a2be17c780" }, "approvalfy" : 1999, "board_approval_month" : "November", "boardapprovaldate" : "2013-11-12T00:00:00Z", "borrower" : "FEDERAL DEMOCRATIC REPUBLIC OF ETHIOPIA", "closingdate" : "2018-07-07T00:00:00Z", "country_namecode" : "Federal Democratic Republic of Ethiopia!$!ET", "countrycode" : "ET", "countryname" : "Federal Democratic Republic of Ethiopia", "countryshortname" : "Ethiopia", "docty" : "Project Information Document,Indigenous Peoples Plan,Project Information Document", "envassesmentcategorycode" : "C", "grantamt" : 0, "ibrdcommamt" : 0, "id" : "P129828", "idacommamt" : 130000000, "impagency" : "MINISTRY OF EDUCATION", "lendinginstr" : "Investment Project Financing", "lendinginstrtype" : "IN", "lendprojectcost" : 550000000, "majorsector_percent" : [ { "Name" : "Education", "Percent" : 46 }, { "Name" : "Education", "Percent" : 26 }, { "Name" : "Public Administration, Law, and Justice", "Percent" : 16 }, { "Name" : "Education", "Percent" : 12 } ], "mjtheme" : [ "Human development" ], "mjtheme_namecode" : [ { "name" : "Human development", "code" : "8" }, { "name" : "", "code" : "11" } ], "mjthemecode" : "8,11", "prodline" : "PE", "prodlinetext" : "IBRD/IDA", "productlinetype" : "L", "project_abstract" : { "cdata" : "The development  }, "project_name" : "Ethiopia General Education Quality Improvement Project II",  "projectfinancialtype" : "IDA", "projectstatusdisplay" : "Active", "regionname" : "Africa", "sector1" : { "Name" : "Primary education", "Percent" : 46 }, "sector2" : { "Name" : "Secondary education", "Percent" : 26 }, "sector3" : { "Name" : "Public administration- Other social services", "Percent" : 16 }, "sector4" : { "Name" : "Tertiary education", "Percent" : 12 }, "sectorcode" : "ET,BS,ES,EP", "source" : "IBRD", "status" : "Active", "supplementprojectflg" : "N", "theme1" : { "Name" : "Education for all", "Percent" : 100 }, "themecode" : "65", "totalamt" : 130000000, "totalcommamt" : 130000000, "url" : "http://www.worldbank.org/projects/P129828/ethiopia-general-education-quality-improvement-project-ii?lang=en" }

But because of the type cast I get the error below. How should I handle a case like this, and how can I find out the schema?

java.lang.ClassCastException: scala.collection.mutable.ArrayBuffer cannot be cast to java.lang.String

1 Answer:

Answer 0 (score: 0)

The problem is that the result of that query is a structure containing an array. When you try to map over the result with row.getString(1), it fails with a ClassCastException because the corresponding object is not a String.

The result of a SQL query is a DataFrame, and you can ask it for its schema like this (you can use the same commands from the Java API):

scala> val data = sqlContext.sql("select majorsector_percent from logs limit 1 ")
data: org.apache.spark.sql.DataFrame = [majorsector_percent: array<struct<Name:string,Percent:bigint>>]

scala> data.printSchema
root
 |-- majorsector_percent: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- Name: string (nullable = true)
 |    |    |-- Percent: long (nullable = true)

You can extract the information you need from the resulting DataFrame like this:

data.select("majorsector_percent.Name","majorsector_percent.Percent")

scala> data.select("majorsector_percent.Name","majorsector_percent.Percent").collect
res4: Array[org.apache.spark.sql.Row] = Array([WrappedArray(Education, Education, Public Administration, Law, and Justice, Education),WrappedArray(46, 26, 16, 12)])
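
Since the same steps work from the Java API, here is a minimal Java sketch of the flow above. It assumes the Spark 1.4+ SQLContext/DataFrame API (sqlContext.read().json(...)) rather than the deprecated JavaSQLContext from the question, and it reuses the question's file path; the class name, app name, and local master are made up for illustration:

    import java.util.List;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SQLContext;

    public class MajorSectorNames {
        public static void main(String[] args) {
            // Local master only for illustration; drop it when submitting to a cluster.
            JavaSparkContext sc = new JavaSparkContext(
                    new SparkConf().setAppName("world-bank-json").setMaster("local[*]"));
            SQLContext sqlContext = new SQLContext(sc);

            sqlContext.read().json("C:/Users/HimanshuK/Downloads/world_bank/world_bank.json")
                    .registerTempTable("logs");
            sqlContext.cacheTable("logs");

            DataFrame data = sqlContext.sql("select majorsector_percent from logs limit 1");
            data.printSchema();  // majorsector_percent: array<struct<Name:string,Percent:bigint>>

            // The single column is an array of structs, so read it as a list of nested Rows
            // instead of calling getString on it.
            for (Row row : data.collectAsList()) {
                List<Row> sectors = row.getList(0);
                for (Row sector : sectors) {
                    System.out.println(sector.getString(0) + " -> " + sector.getLong(1));
                }
            }

            sc.stop();
        }
    }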

Or you can simplify the process by using a more specific query:

scala> val directQuery = sqlContext.sql("select majorsector_percent.Name, majorsector_percent.Percent from logs limit 1 ")
directQuery: org.apache.spark.sql.DataFrame = [Name: array<string>, Percent: array<bigint>]

scala> directQuery.collect
res5: Array[org.apache.spark.sql.Row] = Array([WrappedArray(Education, Education, Public Administration, Law, and Justice, Education),WrappedArray(46, 26, 16, 12)])
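
For completeness, a hedged Java counterpart of this more specific query, continuing from the sketch above (the extra imports would be java.util.ArrayList and scala.Tuple2; pairing the two parallel arrays into Tuple2 is just one possible way to line up each Name with its Percent, which is what the original code appeared to be after):

    // Continues from the sketch above; each result row carries two parallel
    // arrays: all Names and all Percents.
    DataFrame directQuery = sqlContext.sql(
            "select majorsector_percent.Name, majorsector_percent.Percent from logs limit 1");

    List<Tuple2<String, Long>> sectors = new ArrayList<>();
    for (Row row : directQuery.collectAsList()) {
        List<String> names = row.getList(0);
        List<Long> percents = row.getList(1);
        for (int i = 0; i < names.size(); i++) {
            sectors.add(new Tuple2<>(names.get(i), percents.get(i)));
        }
    }
    System.out.println(sectors);  // e.g. [(Education,46), (Education,26), ...]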