Spark Controller installation via Ambari fails

Date: 2016-12-15 14:43:21

Tags: hadoop apache-spark failed-installation

When we try to install Spark Controller through Ambari, it fails with an error.

Below is the error we get:

stderr: /var/lib/ambari-agent/data/errors-403.txt

File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/SparkController/package/scripts/controller_conf.py", line 10, in controller_conf
    recursive = True

File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 147, in __init__
    raise Fail("%s received unsupported argument %s" % (self, key))
resource_management.core.exceptions.Fail: Directory['/usr/sap/spark/controller/conf'] received unsupported argument recursive

stdout: /var/lib/ambari-agent/data/output-403.txt

2016-12-15 08:44:36,441 - Skipping installation of existing package curl
2016-12-15 08:44:36,441 - Package['hdp-select']   {'retry_on_repo_unavailability': False, 'retry_count': 5} 
2016-12-15 08:44:36,496 - Skipping installation of existing package hdp-select Start installing 
2016-12-15 08:44:36,668 - Execute['cp -r /var/lib/ambari-agent/cache/stacks/HDP/2.3/services/SparkController/package/files/sap/spark /usr/sap'] {} 
2016-12-15 08:44:36,685 - Execute['chown hanaes:sapsys /var/log/hanaes'] {} Configuring... Command failed after 1 tries

Versions:

Ambari : 2.4.2.0   
Spark : 1.5.2.2.3   
Spark Controller : 1.6.1

1 Answer:

Answer 0 (score: 0):

A customer message was raised with SAP, whose resolution was: "Known issue with Spark Controller 1.6.2, please upgrade to Spark Controller 2.0".

After upgrading to Spark Controller 2.0, the installation succeeded, so this thread is being closed.
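For background on the error itself (this note is not part of the original answer): newer Ambari releases replaced the `recursive` argument of the `Directory` resource with `create_parents`, so stack scripts written against the older API trip Ambari's argument validation, producing exactly the `received unsupported argument recursive` failure above. The sketch below is a hypothetical, simplified stand-in for that validation in `resource_management/core/base.py`, not the real Ambari code:

```python
class Fail(Exception):
    """Simplified stand-in for resource_management.core.exceptions.Fail."""
    pass


class Directory(object):
    """Hypothetical, simplified model of Ambari's Directory resource."""

    # Newer Ambari accepts create_parents; older stack scripts passed
    # recursive, which is no longer in the allowed set (simplified list).
    _allowed = {"owner", "group", "mode", "create_parents"}

    def __init__(self, path, **kwargs):
        self.path = path
        for key in kwargs:
            if key not in self._allowed:
                # Mirrors the check that raised the error in the question.
                raise Fail("%s received unsupported argument %s" % (self, key))
        self.kwargs = kwargs

    def __repr__(self):
        return "Directory['%s']" % self.path


# The Spark Controller 1.6.x script's call is rejected:
try:
    Directory("/usr/sap/spark/controller/conf", recursive=True)
except Fail as e:
    print(e)  # ...received unsupported argument recursive

# The newer spelling of the same intent is accepted:
Directory("/usr/sap/spark/controller/conf", create_parents=True)
```

This is also why upgrading Spark Controller resolved the problem: the newer package ships stack scripts whose `Directory` calls match the argument names that Ambari 2.4 expects.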