Hive on Spark not working - Failed to create Spark client

Date: 2016-08-01 09:06:11

Tags: apache-spark hive apache-spark-sql

I am getting an error when executing a Hive query with Spark as the execution engine.

Error:
    Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
    FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask

Hive Console:
    hive> set hive.execution.engine=spark;
    hive> set spark.master=spark://INBBRDSSVM15.example.com:7077;
    hive> set spark.executor.memory=2g;

    Hadoop - 2.7.0
    Hive - 1.2.1
    Spark - 1.6.1
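
For reference, the same settings can also be made persistent in hive-site.xml rather than set per session. A minimal sketch mirroring the console commands above (values taken from the question; placing the file in the standard Hive conf directory is an assumption):

    <!-- hive-site.xml: persistent equivalent of the per-session set commands above -->
    <property>
      <name>hive.execution.engine</name>
      <value>spark</value>
    </property>
    <property>
      <name>spark.master</name>
      <value>spark://INBBRDSSVM15.example.com:7077</value>
    </property>
    <property>
      <name>spark.executor.memory</name>
      <value>2g</value>
    </property>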

1 Answer:

Answer 0 (score: 2):


The YARN container memory was smaller than what the Spark executor required. I set the YARN container memory and maximum allocation to be greater than Spark executor memory + overhead, and it worked. Check 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
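
Assuming Spark runs on YARN as this answer implies, a minimal yarn-site.xml sketch of that check might look like the following; the 4096/3072 MB values are illustrative only (not from the original answer) and should simply be sized above spark.executor.memory (2g in the question) plus the executor memory overhead:

    <!-- yarn-site.xml: illustrative limits, adjust to the node's actual capacity -->
    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>4096</value>  <!-- total memory YARN may allocate to containers on this node -->
    </property>
    <property>
      <name>yarn.scheduler.maximum-allocation-mb</name>
      <value>3072</value>  <!-- must exceed Spark executor memory + overhead (2g + a few hundred MB here) -->
    </property>

The NodeManager and ResourceManager generally need a restart for changes to these properties to take effect.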

Please see Source here