Cloudera CDH 5.1.3: adding Spark 0.9 via the Add Service wizard fails with "SCALA_HOME is not set"

Date: 2015-04-16 10:27:50

Tags: scala apache-spark yarn cloudera-cdh cloudera-manager

I get the following error when adding Spark to CDH, with master and worker roles configured. I followed the instructions at this link: https://docs.sigmoidanalytics.com/index.php/Installing_Spark_and_Setting_Up_Your_Cluster

Although I set SCALA_HOME, it still gives me the following error:

Service did not start successfully; not all of the required roles started: Service has only 0 Master roles running instead of minimum required 1.
Program: csd/csd.sh ["start_worker","./master.properties"]
Program: csd/csd.sh ["start_master","./master.properties"]

++ CDH_HADOOP_BIN=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hadoop/bin/hadoop
++ export CDH_IMPALA_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/impala
++ CDH_IMPALA_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/impala
++ export CDH_SOLR_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/solr
++ CDH_SOLR_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/solr
++ export CDH_HBASE_INDEXER_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hbase-solr
++ CDH_HBASE_INDEXER_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/hbase-solr
++ export SEARCH_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/search
++ SEARCH_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/search
++ export CDH_SPARK_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/spark
++ CDH_SPARK_HOME=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/spark
++ export WEBHCAT_DEFAULT_XML=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ WEBHCAT_DEFAULT_XML=/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/etc/hive-webhcat/conf.dist/webhcat-default.xml
+ echo 'Using /var/run/cloudera-scm-agent/process/751-spark-SPARK_MASTER as conf dir'
+ echo 'Using scripts/control.sh as process script'
+ chmod u+x /var/run/cloudera-scm-agent/process/751-spark-SPARK_MASTER/scripts/control.sh
+ exec /var/run/cloudera-scm-agent/process/751-spark-SPARK_MASTER/scripts/control.sh start_master ./master.properties
Thu Apr 16 09:44:51 GMT 2015
Thu Apr 16 09:44:51 GMT 2015: Detected CDH_VERSION of [5]
Thu Apr 16 09:44:51 GMT 2015: Found a master on syseng03 listening on port 7077
Thu Apr 16 09:44:51 GMT 2015: Starting Spark master on syseng03 and port 7077
/app/opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/spark/bin/compute-classpath.sh: line 65: hadoop: command not found
SCALA_HOME is not set

1 Answer:

Answer 0: (score: 0)

If you look closely at the log messages, they show that the classpath was not set up correctly when Spark was installed through Cloudera Manager (CM is supposed to resolve all dependencies automatically, since I installed from the wizard, but somehow it did not). The workaround I used was to edit the /opt/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/lib/spark/bin/compute-classpath.sh script and add all the classpath information manually. I added the following lines *before* the script builds the required classpath variables:

PATH=$PATH:/opt/cloudera/parcels/CDH/bin
export SCALA_HOME=/app/scala-2.10.4
export SCALA_LIBRARY_PATH=/app/scala-2.10.4/lib
CLASSPATH="$CLASSPATH:/app/opt/cloudera/parcels/CDH/lib/hive/lib/*"
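The manual edit above can be scripted so it is safe to re-run (for example after a parcel upgrade). Below is a minimal sketch of that idea: it inserts the same four lines right after the shebang of compute-classpath.sh, but only if a marker comment is not already present. The SCALA_HOME and parcel paths are taken from the answer above and will differ on your cluster; the demo writes to a local copy of the file rather than the real one under the parcel directory.

```shell
#!/bin/sh
# Demo copy standing in for the real file at
# /opt/cloudera/parcels/CDH-.../lib/spark/bin/compute-classpath.sh
SCRIPT=./compute-classpath.sh
printf '#!/bin/bash\necho classpath-logic-here\n' > "$SCRIPT"

MARKER='# --- manual classpath fix ---'
if ! grep -q "$MARKER" "$SCRIPT"; then
  tmp=$(mktemp)
  {
    head -n 1 "$SCRIPT"      # keep the shebang as the first line
    echo "$MARKER"
    # Paths below are the ones from the answer; adjust for your install.
    echo 'PATH=$PATH:/opt/cloudera/parcels/CDH/bin'
    echo 'export SCALA_HOME=/app/scala-2.10.4'
    echo 'export SCALA_LIBRARY_PATH=/app/scala-2.10.4/lib'
    echo 'CLASSPATH="$CLASSPATH:/app/opt/cloudera/parcels/CDH/lib/hive/lib/*"'
    tail -n +2 "$SCRIPT"     # rest of the original script, unchanged
  } > "$tmp"
  mv "$tmp" "$SCRIPT"
fi
```

Because the insertion is guarded by the marker, running the script twice leaves exactly one copy of the added lines, which avoids the duplicated-export clutter a plain append would accumulate.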