I want to create a table in Spark SQL using the data below.
[{
"empstr": "Blogspan",
"empbyte": 48,
"empshort": 457,
"empint": 935535,
"emplong": 36156987676070,
"empfloat": 6985.98,
"empdoub": 6392455.0,
"empdec": 0.447,
"empbool": 0,
"empdate": "09/29/2018",
"emptime": "2018-03-24 12:56:26"
}, {
"empstr": "Lazzy",
"empbyte": 9,
"empshort": 460,
"empint": 997408,
"emplong": 37564196351623,
"empfloat": 7464.75,
"empdoub": 5805694.86,
"empdec": 0.303,
"empbool": 1,
"empdate": "08/14/2018",
"emptime": "2018-06-17 18:31:15"
}]
However, when I print the schema, all I get is a single _corrupt_record column. Can anyone help me read these JSON records (each of which spans multiple lines) in Java with Spark 2.1.1? I am attaching my code below:
case "readjson":
tempTable = hiveContext.read().json(hiveContext.sparkContext().wholeTextFiles("1.json", 0));
/*In above line i am getting error at .json says
The method json(String...) in the type DataFrameReader is not applicable for the arguments (RDD<Tuple2<String,String>>)
//tempTable = hiveContext.read().json(componentBean.getHdfsPath());
tempTable.printSchema();
tempTable.show();
tempTable.createOrReplaceTempView(componentKey);
break;
Answer 0 (score: 2)
It looks like you are mixing up parts of the API. Keep in mind that SparkContext != JavaSparkContext: hiveContext.sparkContext().wholeTextFiles(...) returns a Scala RDD<Tuple2<String,String>>, which is exactly what the compile error complains about. You do need wholeTextFiles here, because each of your JSON records spans several lines, and Spark 2.1's JSON reader otherwise expects one JSON object per line (that is where the _corrupt_record column comes from). So obtain a JavaSparkContext from the active SparkSession and read the files as a JavaRDD<String>:
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.SparkSession;

// [...]

SparkSession session = SparkSession.builder().getOrCreate();
SQLContext hiveContext = session.sqlContext();
JavaSparkContext sc = JavaSparkContext.fromSparkContext(session.sparkContext());

// wholeTextFiles returns (path, content) pairs; values() keeps just the file
// contents, giving the JavaRDD<String> that DataFrameReader.json() accepts.
JavaRDD<String> jsonRDD = sc.wholeTextFiles("path/to/data", 2).values();

Dataset<Row> jsonDataset = hiveContext.read().json(jsonRDD);
jsonDataset.show();
// +-------+-------+----------+------+----------+--------+------+--------------+--------+--------+-------------------+
// |empbool|empbyte| empdate|empdec| empdoub|empfloat|empint| emplong|empshort| empstr| emptime|
// +-------+-------+----------+------+----------+--------+------+--------------+--------+--------+-------------------+
// | 0| 48|09/29/2018| 0.447| 6392455.0| 6985.98|935535|36156987676070| 457|Blogspan|2018-03-24 12:56:26|
// | 1| 9|08/14/2018| 0.303|5805694.86| 7464.75|997408|37564196351623| 460| Lazzy|2018-06-17 18:31:15|
// +-------+-------+----------+------+----------+--------+------+--------------+--------+--------+-------------------+
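To get the Spark SQL table you originally asked about, register the Dataset as a temporary view and query it, as your createOrReplaceTempView call intended. A minimal sketch (the view name employees and the query itself are just illustrative):

jsonDataset.createOrReplaceTempView("employees");

// Schema inference reads empdate and emptime as strings, so cast them
// explicitly if you need real date/timestamp columns.
Dataset<Row> result = session.sql(
        "SELECT empstr, empint, emptime FROM employees WHERE empbool = 1");
result.show();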
I hope this helps.