Compressed JSON data in a Hive external table throws an exception when querying?

Posted: 2016-06-14 13:43:45

Tags: hadoop hive cloudera hiveql cloudera-quickstart-vm

I created the external table by following these steps:
hive> ADD JAR /usr/lib/hive/lib/hive-serdes-1.0-SNAPSHOT.jar;
hive> SET hive.exec.compress.output=true;
hive> SET mapred.output.compress=true;
hive> SET mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;
hive> SET io.compression.codecs=org.apache.hadoop.io.compress.GzipCodec;

hive> CREATE EXTERNAL TABLE Json (id BIGINT, created_at STRING, source STRING, favorited BOOLEAN)
      ROW FORMAT SERDE "com.cloudera.hive.serde.JSONSerDe"
      LOCATION "/user/cloudera/jsonGZ";
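
For reference, with either SerDe each line under the table's LOCATION has to be one complete JSON object whose keys match the declared columns. A made-up record (values are purely illustrative, not taken from the question) would look like:

{"id": 123456789, "created_at": "Mon Jun 13 10:15:00 +0000 2016", "source": "web", "favorited": false}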

I compressed my JSON files by running the following command:

hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming-2.6.0-cdh5.5.0.jar \
  -Dmap.output.compress=true \
  -Dmap.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec \
  -Dmapreduce.output.fileoutputformat.compress=true \
  -Dmapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.GzipCodec \
  -input /user/cloudera/json/ \
  -output /user/cloudera/jsonGZ
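
Before querying, it can help to inspect what the streaming job actually wrote into /user/cloudera/jsonGZ. The file name part-00000.gz below is only the usual default for a streaming job, not something given in the question; hadoop fs -text decompresses .gz files before printing:

hadoop fs -ls /user/cloudera/jsonGZ
hadoop fs -text /user/cloudera/jsonGZ/part-00000.gz | head -n 1

If the lines no longer look exactly like the original JSON objects, that would explain the SerDe errors that follow.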

Then, when I run "select * from json;", I get the following error:

OK
Failed with exception java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.map.JsonMappingException: Can not deserialize instance of java.util.LinkedHashMap out of VALUE_NUMBER_INT token at

I also created a table using "org.apache.hive.hcatalog.data.JsonSerDe":

hive> ADD JAR /usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar;

hive> CREATE EXTERNAL TABLE Json1 (id BIGINT, created_at STRING, source STRING, favorited BOOLEAN)
      ROW FORMAT SERDE "org.apache.hive.hcatalog.data.JsonSerDe"
      LOCATION "/user/cloudera/jsonGZ";

Then, when I run "select * from json1;", I get the following error:

Failed with exception java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException: java.io.IOException: Start token not found where expected

after using "org.apache.hive.hcatalog.core" (hive-hcatalog-core-0.13.0.jar).

Am I missing something? How can I fix these errors?

1 answer:

Answer 0 (score: 0)

Just gzip your file and put it into the table's location as it is (*.gz).

gzip filename
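
A minimal sketch of that suggestion, assuming the raw input file is named tweets.json (the name is just a placeholder, not taken from the question): gzip it locally and put the .gz file straight into the table's LOCATION. Hive reads gzipped text files transparently, so no streaming job is needed.

gzip tweets.json
# upload the compressed file into the external table's location
hadoop fs -put tweets.json.gz /user/cloudera/jsonGZ/
# then query the table as usual
hive -e "select * from json limit 10;"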