Oozie Spark job fails on dynamic insert overwrite into a partitioned Hive table

Date: 2018-06-10 15:23:50

Tags: apache-spark hive

I am trying to dynamically insert into a partitioned Hive table using Spark, with the following code:

    String query = "insert overwrite table %s.%s partition(status_flag) select * from %s";
    datasetIFlag().createOrReplaceTempView(metadata.getTargetStoreTableName() + "_tmp");
    spark.sql(String.format(query,
            metadata.getTargetStoreDBName(),
            metadata.getTargetStoreTableName(),
            metadata.getTargetStoreTableName() + "_tmp"));
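For illustration, plugging hypothetical names into the template (mydb, test_table, and test_table_tmp are placeholders, not the real values returned by the metadata object) shows the exact statement that gets submitted:

```java
public class QueryDemo {
    public static void main(String[] args) {
        // Same format template as in the job above.
        String query = "insert overwrite table %s.%s partition(status_flag) select * from %s";
        // Placeholder names standing in for metadata.getTargetStoreDBName() etc.
        String sql = String.format(query, "mydb", "test_table", "test_table_tmp");
        System.out.println(sql);
        // -> insert overwrite table mydb.test_table partition(status_flag) select * from test_table_tmp
    }
}
```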

I have also made sure that status_flag is the last column of the dataset. I am using Spark 2.2.0 with Hive 1.1 on CDH 5.11, and dynamic partitioning is enabled in non-strict mode when the Spark session is created. When Spark runs the query it throws the exception below; any thoughts on what is causing it?
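For reference, by "dynamic partitioning in non-strict mode" I mean the standard Hive properties set when creating the session; in SQL form they are roughly equivalent to (a sketch of the settings, not my exact session-builder code):

```sql
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
```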

    18/06/11 00:49:18 INFO Hive: New loading path = hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/.hive-staging_hive_2018-06-11_00-49-16_284_2447412172536723108-1/-ext-10000/status_flag=I with partSpec {status_flag=I}
    18/06/11 00:49:19 INFO FileUtils: Creating directory if it doesn't exist: hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/status_flag=I
    18/06/11 00:49:19 ERROR ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 1 in table test_table with loadPath=hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/.hive-staging_hive_2018-06-11_00-49-16_284_2447412172536723108-1/-ext-10000;
    org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 1 in table test_table with loadPath=hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/.hive-staging_hive_2018-06-11_00-49-16_284_2447412172536723108-1/-ext-10000;
            at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:108)
            at org.apache.spark.sql.hive.HiveExternalCatalog.loadDynamicPartitions(HiveExternalCatalog.scala:891)
            at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.run(InsertIntoHiveTable.scala:331)
            at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
            at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
            at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
            at org.apache.spark.sql.Dataset.<init>(Dataset.scala:182)
            at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:67)
            at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623)
            at test.common.utils.HDFSUtility.writeIDRecords(HDFSUtility.java:353)
            at test.persist.impl.DataPersistServiceImpl.execute(DataPersistServiceImpl.java:140)
            at test.pipeline.SparkDataPersistPipeline.execute(SparkDataPersistPipeline.java:131)
            at test.pipeline.SparkDataPersistPipeline.main(SparkDataPersistPipeline.java:67)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:686)
    Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 1 in table test_table with loadPath=hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/.hive-staging_hive_2018-06-11_00-49-16_284_2447412172536723108-1/-ext-10000
            at org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(Hive.java:1714)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.spark.sql.hive.client.Shim_v0_14.loadDynamicPartitions(HiveShim.scala:772)
            at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadDynamicPartitions$1.apply$mcV$sp(HiveClientImpl.scala:698)
            at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadDynamicPartitions$1.apply(HiveClientImpl.scala:696)
            at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadDynamicPartitions$1.apply(HiveClientImpl.scala:696)
            at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:275)
            at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:216)
            at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:215)
            at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:258)
            at org.apache.spark.sql.hive.client.HiveClientImpl.loadDynamicPartitions(HiveClientImpl.scala:696)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadDynamicPartitions$1.apply$mcV$sp(HiveExternalCatalog.scala:903)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadDynamicPartitions$1.apply(HiveExternalCatalog.scala:891)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadDynamicPartitions$1.apply(HiveExternalCatalog.scala:891)
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:99)
        ... 17 more
Caused by: java.util.concurrent.ExecutionException: java.lang.NoSuchMethodError: org.apache.hadoop.hdfs.client.HdfsAdmin.getKeyProvider()Lorg/apache/hadoop/crypto/key/KeyProvider;
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(Hive.java:1706)
        ... 34 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hdfs.client.HdfsAdmin.getKeyProvider()Lorg/apache/hadoop/crypto/key/KeyProvider;
        at org.apache.hadoop.hive.shims.Hadoop23Shims$HdfsEncryptionShim.<init>(Hadoop23Shims.java:1265)
        at org.apache.hadoop.hive.shims.Hadoop23Shims.createHdfsEncryptionShim(Hadoop23Shims.java:1407)
        at org.apache.hadoop.hive.ql.session.SessionState.getHdfsEncryptionShim(SessionState.java:464)
        at org.apache.hadoop.hive.ql.metadata.Hive.needToCopy(Hive.java:2973)
        at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2874)
        at org.apache.hadoop.hive.ql.metadata.Hive.replaceFiles(Hive.java:3199)
        at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1465)
        at org.apache.hadoop.hive.ql.metadata.Hive$2.call(Hive.java:1685)
        at org.apache.hadoop.hive.ql.metadata.Hive$2.call(Hive.java:1676)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

0 Answers:

No answers