Failed to find Spark assembly after upgrading DSE 4.6 to DSE 4.7

Asked: 2015-05-27 09:14:19

Tags: apache-spark datastax datastax-enterprise spark-jobserver

After upgrading DSE 4.6 to 4.7 I ran into a problem with job-server-0.5.0. If I run server_start.sh I get the error "Failed to find Spark assembly in /usr/share/dse/spark/assembly/target/scala-2.10. You need to build Spark before running this program."

I looked in /usr/share/dse/spark/bin/compute-classpath.sh and found the code that raises the error:

for f in ${assembly_folder}/spark-assembly*hadoop*.jar; do
  if [[ ! -e "$f" ]]; then
    echo "Failed to find Spark assembly in $assembly_folder" 1>&2
    echo "You need to build Spark before running this program." 1>&2
    exit 1
  fi
  ASSEMBLY_JAR="$f"
  num_jars=$((num_jars+1))
done

If I run /usr/share/dse/spark/bin/spark-submit I get the same error.

1 Answer:

Answer 0 (score: 0)

If you are running under DSE, spark-jobserver is most likely supposed to start without touching compute-classpath.sh at all. You can try modifying the start script to launch through dse spark-submit, as in the example below.

# job server jar needs to appear first so its deps take higher priority
# need to explicitly include app dir in classpath so logging configs can be found
#CLASSPATH="$appdir:$appdir/spark-job-server.jar:$($SPARK_HOME/bin/compute-classpath.sh)"

#exec java -cp $CLASSPATH $GC_OPTS $JAVA_OPTS $LOGGING_OPTS $CONFIG_OVERRIDES $MAIN $conffile 2>&1 &
dse spark-submit --class $MAIN $appdir/spark-job-server.jar --driver-java-options "$GC_OPTS $JAVA_OPTS $LOGGING_OPTS" $conffile 2>&1 &

The unmodified start script, for reference:
https://github.com/spark-jobserver/spark-jobserver/blob/f5406a50406c59f26c878d7cee7334d6b9203312/bin/server_start.sh