I am getting an error when running Hive queries with Spark as the execution engine.
Error:
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
Hive Console:
hive> set hive.execution.engine=spark;
hive> set spark.master=spark://INBBRDSSVM15.example.com:7077;
hive> set spark.executor.memory=2g;
Hadoop - 2.7.0
Hive - 1.2.1
Spark - 1.6.1
Answer 0 (score: 2):
The YARN container memory was smaller than what the Spark executor required. I set the YARN container memory and maximum allocation to be greater than Spark executor memory + overhead. Check 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
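For reference, Spark 1.6 on YARN sizes each executor container as executor memory plus an overhead of max(384 MB, 10% of executor memory) by default (the `spark.yarn.executor.memoryOverhead` setting). A minimal sketch of that arithmetic, assuming those defaults:

```python
def required_container_mb(executor_mb, overhead_fraction=0.10, min_overhead_mb=384):
    # Spark 1.6 default: overhead is 10% of executor memory, but at least 384 MB
    overhead = max(min_overhead_mb, int(executor_mb * overhead_fraction))
    return executor_mb + overhead

# With spark.executor.memory=2g as in the question:
print(required_container_mb(2048))  # -> 2432
```

So with a 2 GB executor, both yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.memory-mb would need to be at least about 2432 MB for the container request to be granted.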