Unable to install Apache Spark on DC/OS

Date: 2017-07-28 07:57:48

Tags: apache-spark dcos

I installed DC/OS version 1.9.2 on OpenStack, and then tried to install Apache Spark on it:

dcos package install spark
Installing Marathon app for package [spark] version [1.1.0-2.1.1]
Installing CLI subcommand for package [spark] version [1.1.0-2.1.1]
New command available: dcos spark
DC/OS Spark is being installed!

However, the DC/OS dashboard shows Spark stuck in "Deploying" and the task never runs. The task's error log shows these messages:

I0728 16:43:36.348244 14038 exec.cpp:162] Version: 1.2.2
I0728 16:43:36.656839 14046 exec.cpp:237] Executor registered on agent abf187f4-ad7d-4ead-9437-5cdba4f77bdc-S1
+ export DISPATCHER_PORT=24238
+ DISPATCHER_PORT=24238
+ export DISPATCHER_UI_PORT=24239
+ DISPATCHER_UI_PORT=24239
+ export SPARK_PROXY_PORT=24240
+ SPARK_PROXY_PORT=24240
+ SCHEME=http
+ OTHER_SCHEME=https
+ [[ '' == true ]]
+ export DISPATCHER_UI_WEB_PROXY_BASE=/service/spark
+ DISPATCHER_UI_WEB_PROXY_BASE=/service/spark
+ grep -v '#https#' /etc/nginx/conf.d/spark.conf.template
+ sed s,#http#,,
+ sed -i 's,<PORT>,24240,' /etc/nginx/conf.d/spark.conf
+ sed -i 's,<DISPATCHER_URL>,http://172.16.129.180:24238,' /etc/nginx/conf.d/spark.conf
+ sed -i 's,<DISPATCHER_UI_URL>,http://172.16.129.180:24239,' /etc/nginx/conf.d/spark.conf
+ sed -i 's,<PROTOCOL>,,' /etc/nginx/conf.d/spark.conf
+ [[ '' == true ]]
+ [[ -f hdfs-site.xml ]]
+ [[ -n '' ]]
+ exec runsvdir -P /etc/service
+ mkdir -p /mnt/mesos/sandbox/nginx
+ mkdir -p /mnt/mesos/sandbox/spark
+ exec svlogd /mnt/mesos/sandbox/nginx
+ exec svlogd /mnt/mesos/sandbox/spark
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32

How can I get the Spark task running on DC/OS? Thanks.

1 Answer:

Answer 0 (score: 0)

Check whether you uninstalled any previous Spark installation cleanly. You must delete the old Spark ZooKeeper entries (under /Exhibitor).
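On DC/OS releases of that era, `dcos package uninstall` did not remove the framework's reserved resources or ZooKeeper state by itself; the docs recommended running a janitor container afterwards. A hedged sketch of that cleanup (the role, principal, and znode names below are the usual Spark package defaults and are assumptions that may differ in your install):

```shell
# Uninstall the package first.
dcos package uninstall spark

# Then run the janitor from the leading master to remove leftover
# reservations, the principal, and the dispatcher's ZooKeeper znode.
# The -r/-p/-z values are assumed Spark package defaults.
dcos node ssh --master-proxy --leader \
  "docker run mesosphere/janitor /janitor.py \
    -r spark-role -p spark-principal -z spark_mesos_dispatcher"
```

After the cleanup completes, reinstall with `dcos package install spark`.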

Also check in Mesos that there are no zombie frameworks holding resources and blocking the new deployment. Kill them with the master's teardown endpoint:

curl -XPOST http://MESOSMASTER_URL:5050/master/teardown -d 'frameworkId=<FRAMEWORKID>'
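Separately, the repeated `nginx: [emerg]` lines in your log mean the proxy inside the Spark dispatcher container never started: nginx could not build its MIME types hash within its compiled-in limits. If it comes to patching the image or its config (an assumption — upgrading to a newer Spark package version may be the simpler route), the usual remedy is to raise the hash sizes in the `http` block of `nginx.conf`:

```nginx
# Hedged sketch: the error reports the current limits (1024 / 32);
# doubling both is the common fix. Not taken from the DC/OS package.
http {
    types_hash_max_size    2048;
    types_hash_bucket_size 64;
    include mime.types;
}
```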