My question is related to an existing thread, but we are on HDP 2.6.3 and Ambari 2.6.1.5.
Problem: we are trying to access Hive table data from Spark 2.2 with the following command:
spark-submit --class com.virtuslab.sparksql.MainClass --master yarn --deploy-mode client /tmp/spark-hive-test/spark_sql_under_the_hood-spark2.2.0.jar
In client mode this works. Note that we have not passed --files or --conf spark.yarn.dist.files.
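For reference, here is a minimal sketch of the kind of test class being submitted (the actual MainClass inside the jar may differ; the database and table names below are placeholders taken from the error message further down):

import org.apache.spark.sql.SparkSession

object MainClass {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport makes Spark use the Hive metastore configured
    // in hive-site.xml instead of its built-in in-memory catalog.
    val spark = SparkSession.builder()
      .appName("spark-hive-test")
      .enableHiveSupport()
      .getOrCreate()

    // 'qwerty.xyz' stands in for the real database/table from the error.
    spark.sql("SELECT * FROM qwerty.xyz").show(10)

    spark.stop()
  }
}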
spark-submit --class com.virtuslab.sparksql.MainClass --master yarn --deploy-mode cluster /tmp/spark-hive-test/spark_sql_under_the_hood-spark2.2.0.jar
In cluster mode it fails with:
diagnostics: User class threw exception:
org.apache.spark.sql.catalyst.analysis.NoSuchTableException: Table or view
'xyz' not found in database 'qwerty';
ApplicationMaster host: 121.121.121.121
ApplicationMaster RPC port: 0
queue: default
start time: 1523616607943
final status: FAILED
tracking URL: https://managenode002xxserver:8090/proxy/application_1523374609937_10224/
user: abc123
Exception in thread "main" org.apache.spark.SparkException: Application
application_1523374609937_10224 finished with failed status
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1187)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1233)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:782)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Note that we did not use --files or --conf spark.yarn.dist.files there. However, the same job works with this:
spark-submit --class com.virtuslab.sparksql.MainClass --master yarn --deploy-mode cluster --files /etc/spark2/conf/hive-site.xml /tmp/spark-hive-test/spark_sql_under_the_hood-spark2.2.0.jar
and we can see the results.
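An equivalent variant using spark.yarn.dist.files instead of --files (we have not confirmed this ourselves; it is simply the configuration-property form of the same option) would be:

spark-submit --class com.virtuslab.sparksql.MainClass --master yarn --deploy-mode cluster --conf spark.yarn.dist.files=/etc/spark2/conf/hive-site.xml /tmp/spark-hive-test/spark_sql_under_the_hood-spark2.2.0.jar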
Is there any bug that prevents Spark from picking up /etc/spark2/conf when running in YARN cluster mode?
Note: /etc/spark2/conf contains hive-site.xml on all nodes of the cluster.
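For what it's worth, a quick way to check from inside the driver whether hive-site.xml was actually applied (a sketch, assuming a SparkSession built with enableHiveSupport as in the snippet above) is to inspect the catalog:

// If hive-site.xml was not applied, Spark falls back to its in-memory
// catalog and only the 'default' database is visible here.
spark.catalog.listDatabases().show(false)
// Placeholder names taken from the error message above.
println(spark.catalog.tableExists("qwerty", "xyz"))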