HiveException: Failed to create spark client

Asked: 2016-08-24 14:36:01

Tags: apache-spark hivecontext

1) I created a SQL file that collects data from two different Hive tables and inserts it into a single Hive table.

2) We call this SQL file from a shell script.

3) Sample Spark settings:

SET hive.execution.engine=spark;
SET spark.master=yarn-cluster;
SET spark.app.name="ABC_${hiveconf:PRC_DT}_${hiveconf:JOB_ID}";
--SET spark.driver.memory=8g;
--SET spark.executor.memory=8g;
SET hive.exec.dynamic.partition.mode=nonstrict;
SET hive.stats.fetch.column.stats=true;
SET hive.optimize.index.filter=true;
SET hive.map.aggr=true;
SET hive.exec.parallel=true;
SET spark.executor.cores=5;
SET hive.prewarm.enabled=true;
SET hive.spark.client.future.timeout=900;
SET hive.spark.client.server.connect.timeout=100000;

4) Sample Hive query:

insert OVERWRITE table ABC (a,b,c)
select * from ${hiveconf:SCHEMA_NAME}.${hiveconf:TABLE_NAME}
where JOB_ID = '${hiveconf:JOB_ID}'
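Since the settings above enable nonstrict dynamic partitioning, the insert would typically name a partition column as well. A hedged sketch, assuming ABC is partitioned by PRC_DT (the partition column and explicit column list are assumptions, not from the original post):

```sql
-- Hypothetical dynamic-partition variant: assumes ABC is partitioned
-- by PRC_DT and the selected columns match the target schema.
INSERT OVERWRITE TABLE ABC PARTITION (PRC_DT)
SELECT a, b, c, '${hiveconf:PRC_DT}' AS PRC_DT
FROM ${hiveconf:SCHEMA_NAME}.${hiveconf:TABLE_NAME}
WHERE JOB_ID = '${hiveconf:JOB_ID}';
```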

5) Sample script:

hive -f $PARENTDIR/sql/test.sql --hiveconf SCHEMA_NAME=ABC --hiveconf TABLE_NAME=AB1 --hiveconf PRC_DT=${PRC_DT} --hiveconf JOB_ID=${JOB_ID} 
hive -f $PARENTDIR/sql/test.sql --hiveconf SCHEMA_NAME=ABC --hiveconf TABLE_NAME=AB2 --hiveconf PRC_DT=${PRC_DT} --hiveconf JOB_ID=${JOB_ID} 
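The two invocations differ only in TABLE_NAME, so the script can be collapsed into a loop. A minimal sketch, assuming the same four `--hiveconf` parameters as above; the defaults and the `echo` are illustrative placeholders, not from the original script:

```shell
# Hypothetical wrapper around the two near-identical hive calls;
# the default values here are assumptions for illustration.
PARENTDIR="${PARENTDIR:-/opt/etl}"
PRC_DT="${PRC_DT:-2016-08-24}"
JOB_ID="${JOB_ID:-J001}"

run_load() {
  # Echoes the command for illustration; drop 'echo' to run it for real.
  local table="$1"
  echo hive -f "$PARENTDIR/sql/test.sql" \
    --hiveconf SCHEMA_NAME=ABC \
    --hiveconf TABLE_NAME="$table" \
    --hiveconf PRC_DT="$PRC_DT" \
    --hiveconf JOB_ID="$JOB_ID"
}

for t in AB1 AB2; do
  run_load "$t"
done
```

Quoting the variables also guards against word splitting if PRC_DT ever contains spaces.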

Error:
2016-08-24 17:30:05,651 WARN  [main] mapreduce.TableMapReduceUtil: The hbase-prefix-tree module jar containing PrefixTreeCodec is not present.  Continuing without it.

Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/jars/hive-common-1.1.0-cdh5.7.2.jar!/hive-log4j.properties
FAILED: SemanticException Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.

1 Answer:

Answer 0 (score: 1)

The error occurs because an ApplicationMaster was not allocated before the timeout expired. Increase the following parameter (the default is 90000 ms; set it above 100000 ms):

set hive.spark.client.server.connect.timeout=300000ms;
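Applied to the question's settings block, this replaces the original 100000 value. A sketch of the revised fragment, assuming the rest of the settings stay as posted (this property accepts a time-unit suffix; milliseconds are the default unit when it is omitted):

```sql
SET hive.spark.client.future.timeout=900;
-- Raised from 100000 so YARN has time to allocate the ApplicationMaster:
SET hive.spark.client.server.connect.timeout=300000ms;
```

If the larger timeout still fails, it is worth checking whether the YARN queue actually has free containers, since the timeout only masks a slow allocation.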