Pyspark - "unresolved operator 'InsertIntoTable HiveTableRelation'" error when saving data to a Hive table

Date: 2019-04-25 10:57:45

Tags: hadoop hive pyspark

I am using the following:

  • pyspark library, version 2.3.1
  • python, version 2.7.1
  • hadoop, version 2.7.3
  • hive, version 1.2.1000.2.6.5.30-1
  • spark, version 2

My Hive table looks like this:

CREATE TABLE IF NOT EXISTS my_database.my_table
(
    division STRING COMMENT 'Sample column'
)

I want to save data into Hive using pyspark. I use the following code:

from pyspark.sql import SparkSession, HiveContext

spark_session = SparkSession.builder.getOrCreate()
hive_context = HiveContext(spark_session.sparkContext)

# Read the schema of the existing Hive table.
hive_table_schema = hive_context.table("my_database.my_table").schema

# Build a DataFrame with the same schema and append it to the table.
df_to_save = spark_session.createDataFrame([["a"], ["b"], ["c"]], schema=hive_table_schema)
df_to_save.write.mode("append").insertInto("my_database.my_table")

But the following error occurs:

Traceback (most recent call last):
  File "/home/my_user/mantis service_quality_check__global/scripts/row_counts_preprocess.py", line 147, in <module> df_to_save.write.mode("append").insertInto(hive_table_row_counts_str)
  File "/usr/hdp/current/spark2-client/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 716, in insertInto
  File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 1160, in __call__
  File "/usr/hdp/current/spark2-client/python/lib/pyspark.zip/pyspark/sql/utils.py", line 69, in deco
  pyspark.sql.utils.AnalysisException: u"unresolved operator 'InsertIntoTable HiveTableRelation `my_database`.`my_table`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [division#14], false, false;;\n'InsertIntoTable HiveTableRelation `my_database`.`my_table`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [division#14], false, false\n+- LogicalRDD [division#2], false\n"

Can someone please help? I have been stuck on this for a few days.

1 answer:

Answer 0 (score: 0)

I found the problem: the SparkSession must have Hive support enabled. The enableHiveSupport() method has to be called when building the Spark session.

Then create the Spark session as follows:

spark_session = SparkSession.builder.enableHiveSupport().getOrCreate()
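
For completeness, here is a minimal end-to-end sketch of the corrected write path. The appName value is illustrative; the database and table names are taken from the question.

from pyspark.sql import SparkSession

# Build a Hive-enabled session; without enableHiveSupport() Spark uses its
# default catalog, and insertInto() against a Hive table fails to resolve.
spark_session = SparkSession.builder \
    .appName("hive_insert_example") \
    .enableHiveSupport() \
    .getOrCreate()

# In Spark 2.x the session can read the Hive table schema directly,
# so a separate HiveContext is no longer needed.
hive_table_schema = spark_session.table("my_database.my_table").schema

# Build a DataFrame matching the table's schema and append it by position.
df_to_save = spark_session.createDataFrame([["a"], ["b"], ["c"]], schema=hive_table_schema)
df_to_save.write.mode("append").insertInto("my_database.my_table")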