java.lang.ClassCastException: org.apache.hadoop.hive.ql.io.orc.OrcStruct cannot be cast to org.apache.hadoop.io.Text. Error with JSON SerDe

Posted: 2017-07-15 22:55:24

Tags: json hadoop hive hive-serde

I'm new to working with JSON data in Hive. I'm building a Spark application that fetches JSON data and stores it in a Hive table. I have a JSON like this:

[image: Json of Jsons]

which, when expanded, looks like this:

[image: hierarchy]

I am able to read the JSON into a DataFrame and save it to a location on HDFS. Getting Hive to read that data, however, is the hard part.

For example, after searching online, I tried the following:

Use a STRUCT for every JSON field, and then access the elements with column.element notation.

For example:

web_app_security would be the name of a column (of type STRUCT) in the table, and the JSONs nested inside it (such as config_web_cms_authentication and web_threat_intel_alert_external) would also be structs, with rating and rating_numeric as their fields.
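
With that layout, nested fields are read with dot notation. A minimal sketch of such a query, assuming the jsons table defined below exists and is readable:

SELECT
  web_app_security.rating,
  web_app_security.config_web_cms_authentication.rating_numeric
FROM jsons;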

I tried creating the table with a JSON SerDe. This is my table definition:

CREATE EXTERNAL TABLE jsons (
web_app_security struct<config_web_cms_authentication: struct<rating: string, rating_numeric: float>, web_threat_intel_alert_external: struct<rating: string, rating_numeric: float>, web_http_security_headers: struct<rating: string, rating_numeric: float>, rating: string, rating_numeric: float>,
dns_security struct<domain_hijacking_protection: struct<rating: string, rating_numeric: float>, rating: string, rating_numeric: float, dns_hosting_providers: struct<rating:string, rating_numeric: float>>,
email_security struct<rating: string, email_encryption_enabled: struct<rating: string, rating_numeric: float>, rating_numeric: float, email_hosting_providers: struct<rating: string, rating_numeric: float>, email_authentication: struct<rating: string, rating_numeric: float>>,
threat_intell struct<rating: string, threat_intel_alert_internal_3: struct<rating: string, rating_numeric: float>, threat_intel_alert_internal_1: struct<rating: string, rating_numeric: float>, rating_numeric: float,  threat_intel_alert_internal_12: struct<rating: string, rating_numeric: float>, threat_intel_alert_internal_6: struct<rating: string, rating_numeric: float>>,
data_loss struct<data_loss_6: struct<rating: string, rating_numeric: float>, rating: string, data_loss_36plus: struct<rating: string, rating_numeric: float>, rating_numeric: float,  data_loss_36: struct<rating: string, rating_numeric: float>, data_loss_12: struct<rating: string, rating_numeric: float>, data_loss_24: struct<rating: string, rating_numeric: float>>,
system_hosting struct<host_hosting_providers: struct<rating: string, rating_numeric: float>,  hosting_countries: struct<rating: string, rating_numeric: float>, rating: string, rating_numeric: float>,
defensibility struct<attack_surface_web_ip: struct<rating: string, rating_numeric: float>, shared_hosting: struct<rating: string, rating_numeric: float>, defensibility_hosting_providers: struct<rating: string, rating_numeric: float>, rating: string, rating_numeric: float, attack_surface_web_hostname: struct<rating: string, rating_numeric: float>>,
software_patching struct<patching_web_cms: struct<rating: string, rating_numeric: float>, rating: string, patching_web_server: struct<rating: string, rating_numeric: float>, patching_vuln_open_ssl: struct<rating: string, rating_numeric: float>, patching_app_server: struct<rating: string, rating_numeric: float>, rating_numeric: float>,
governance struct<governance_customer_base: struct<rating: string, rating_numeric: float>, governance_security_certifications: struct<rating: string, rating_numeric: float>, governance_regulatory_requirements: struct<rating: string, rating_numeric: float>, rating: string, rating_numeric: float>
)ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
STORED AS orc
LOCATION 'hdfs://nameservice1/data/gis/final/rr_current_analysis'

I was trying to parse the rows with the JSON SerDe. After I saved some data into the table, I got the following error when I tried to query it:

Error: java.io.IOException: java.lang.ClassCastException: org.apache.hadoop.hive.ql.io.orc.OrcStruct cannot be cast to org.apache.hadoop.io.Text (state=,code=0)

I'm not sure whether I'm going about this the right way.

I'm also open to any other way of storing the data in a table. Any help would be appreciated. Thanks.

1 Answer:

Answer 0 (score: 1)

That's because you are mixing ORC as the storage format (STORED AS orc) with JSON as the SerDe (ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'), which overrides ORC's default OrcSerde SerDe but not its input (OrcInputFormat) and output (OrcOutputFormat) formats.
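
You can see the mismatch in the table metadata: DESCRIBE FORMATTED prints the SerDe library together with the input and output format classes, and for the table above it should show something like the following (the output shown in the comments is an expectation, not captured output):

DESCRIBE FORMATTED jsons;
-- SerDe Library:  org.openx.data.jsonserde.JsonSerDe
-- InputFormat:    org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
-- OutputFormat:   org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat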

You need to use ORC storage without overriding its default SerDe; in that case, make sure your Spark application writes the data to the table as ORC, not JSON.
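
A minimal sketch of that option, keeping only the first struct column for brevity (the table name jsons_orc is hypothetical; the remaining columns are the same as in the original definition):

CREATE EXTERNAL TABLE jsons_orc (
web_app_security struct<config_web_cms_authentication: struct<rating: string, rating_numeric: float>, web_threat_intel_alert_external: struct<rating: string, rating_numeric: float>, web_http_security_headers: struct<rating: string, rating_numeric: float>, rating: string, rating_numeric: float>
-- the remaining struct columns go here, exactly as in the original definition
)
STORED AS ORC
LOCATION 'hdfs://nameservice1/data/gis/final/rr_current_analysis';

The Spark side would then write ORC files to that location (or a new one) instead of JSON.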

Or, if you want the data stored as JSON, use JsonSerDe together with plain text files as the storage format (STORED AS TEXTFILE).
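
A sketch of that variant, again abbreviated to the first struct column (the table name jsons_text and the reuse of the same HDFS location are assumptions):

CREATE EXTERNAL TABLE jsons_text (
web_app_security struct<config_web_cms_authentication: struct<rating: string, rating_numeric: float>, web_threat_intel_alert_external: struct<rating: string, rating_numeric: float>, web_http_security_headers: struct<rating: string, rating_numeric: float>, rating: string, rating_numeric: float>
-- the remaining struct columns go here, exactly as in the original definition
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
STORED AS TEXTFILE
LOCATION 'hdfs://nameservice1/data/gis/final/rr_current_analysis';

The files at that location would then have to be plain JSON text, one JSON document per line, which is what the Spark application would write in this case.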

The Hive Developer Guide explains how SerDes and storage formats work: https://cwiki.apache.org/confluence/display/Hive/DeveloperGuide#DeveloperGuide-HiveSerDe