Error during argument parsing when submitting a Spark application programmatically

Asked: 2016-08-26 19:57:14

Tags: java apache-spark yarn

I want to submit a Java Spark application to YARN programmatically (in Java, not in Scala). When I try to do so with this code:

package application.RestApplication;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.deploy.yarn.Client;
import org.apache.spark.deploy.yarn.ClientArguments;

public class App {
    public static void main(String[] args1) {
        String[] args = new String[] {
                "--class", "org.apache.spark.examples.JavaWordCount",
                // "--deploy-mode", "cluster",
                // "--master", "yarn",
                // "--driver-memory", "3g",
                // "--executor-memory", "3g",
                "--jar", "/opt/spark/examples/jars/spark-examples_2.11-2.0.0.jar",
                "--arg", "hdfs://hadoop-master:9000/input/file.txt"
        };
        Configuration config = new Configuration();
        System.setProperty("SPARK_YARN_MODE", "true");
        SparkConf sparkConf = new SparkConf();
        ClientArguments cArgs = new ClientArguments(args);
        Client client = new Client(cArgs, config, sparkConf);
        client.run();
    }
}

I get the following error, thrown from the line ClientArguments cArgs = new ClientArguments(args);:

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
    at org.apache.spark.deploy.yarn.ClientArguments.parseArgs(ClientArguments.scala:38)
    at org.apache.spark.deploy.yarn.ClientArguments.<init>(ClientArguments.scala:31)
    at application.RestApplication.App.main(App.java:37)

The problem is the parsing of String[] args: the program starts when the array is empty (but no arguments means no job to run). Whether I pass correct arguments (as above) or incorrect ones (e.g. "--foo", "foo"), I get the same error. How can I fix this?

1 Answer:

Answer 0 (score: 0):

Similar question; you probably have the wrong Scala version on your machine. A NoSuchMethodError deep inside scala.collection is the classic symptom of mixing Scala 2.10 and 2.11 binaries: the Spark jars you compile and run against must use the same Scala version as the Spark 2.0.0 installation on the cluster.
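
For example, if the application is built with Maven, every Spark artifact should carry the same Scala suffix as the cluster's jars (_2.11, judging by spark-examples_2.11-2.0.0.jar in the question); a stray _2.10 artifact on the classpath produces exactly this kind of error. A minimal sketch of the relevant dependencies, with versions assumed from the question:

<dependencies>
  <!-- The Scala suffix (_2.11) must match the Spark build on the cluster -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
  </dependency>
  <!-- Provides org.apache.spark.deploy.yarn.Client and ClientArguments -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-yarn_2.11</artifactId>
    <version>2.0.0</version>
  </dependency>
</dependencies>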

By the way, you could take a look at Livy, which should work well for this kind of job :)
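
If you go that route, here is a minimal, hypothetical sketch of submitting the same job as a Livy batch via its REST API (POST /batches); the host livy-host, the default port 8998, and the jar location are assumptions, and the jar must be at a path the cluster can read:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class LivySubmit {
    public static void main(String[] args) throws Exception {
        // Assumed Livy endpoint; 8998 is Livy's default port.
        URL url = new URL("http://livy-host:8998/batches");

        // Livy's batch API takes the jar, main class and program arguments as JSON.
        String payload = "{"
                + "\"file\": \"hdfs://hadoop-master:9000/jars/spark-examples_2.11-2.0.0.jar\","
                + "\"className\": \"org.apache.spark.examples.JavaWordCount\","
                + "\"args\": [\"hdfs://hadoop-master:9000/input/file.txt\"]"
                + "}";

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(payload.getBytes(StandardCharsets.UTF_8));
        }

        // 201 Created means Livy accepted the batch; poll GET /batches/{id} for its state.
        System.out.println("HTTP " + conn.getResponseCode());
    }
}

Since Livy performs the Spark submission inside its own server process, your application no longer needs Spark and Scala on its classpath at all, which sidesteps version clashes like the one above.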