Cannot launch a Spark job from a web application using SparkLauncher

Asked: 2016-01-20 03:42:16

Tags: apache-spark spark-launcher

I want to launch a Spark job from a Java EE web application. SparkLauncher runs without throwing any exception, but no task is started on the Spark cluster. Can anyone help?

public static void runJob(String userId) throws Exception {
    long previous = System.currentTimeMillis();
    logger.info("initialize spark context...");
    // load spark_home, spark_master, jar_file, main_class, etc. from the properties file
    init("spark-cluster-test.properties");
    // build and launch spark-submit as a child process
    Process spark = new SparkLauncher()
            .setSparkHome(spark_home)
            .setMaster(spark_master)
            .setAppName(app_name + timestamp)
            .setAppResource(jar_file)
            .setMainClass(main_class)
            .setConf(SparkLauncher.DRIVER_MEMORY, spark_driver_memory)
            .setConf("spark.network.timeout", spark_network_timeout)
            .setConf(SparkLauncher.DRIVER_EXTRA_CLASSPATH, class_path)
            .setConf(SparkLauncher.EXECUTOR_EXTRA_CLASSPATH, class_path)
            .setDeployMode(spark_deploy_mode)
            .addAppArgs(userId)
            .launch();
    // wait for the spark-submit process to exit
    spark.waitFor();
    logger.info("spark job is returned after " + (System.currentTimeMillis() - previous) + " milliseconds.");
}
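
Worth noting while debugging this: launch() only returns a plain java.lang.Process, and if its standard output and error streams are never read, whatever the underlying spark-submit prints (including the error that made it exit) is lost. A minimal diagnostic sketch, assuming the same static logger field as above; drainStream is a hypothetical helper, not part of the SparkLauncher API:

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;

// Hypothetical debugging helper: log every line the child spark-submit process
// writes, so a submit that fails before reaching the cluster is no longer silent.
private static void drainStream(final InputStream in, final String prefix) {
    new Thread(new Runnable() {
        public void run() {
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    logger.info(prefix + line);
                }
            } catch (Exception e) {
                logger.warn("error reading " + prefix, e);
            }
        }
    }).start();
}

// Usage, right after launch() and before waitFor():
//   drainStream(spark.getInputStream(), "spark-submit stdout: ");
//   drainStream(spark.getErrorStream(), "spark-submit stderr: ");
//   int exitCode = spark.waitFor();
//   logger.info("spark-submit exited with code " + exitCode);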

Web server log:

2016-01-20 10:27:17 -67129627 [http-bio-8088-exec-127] INFO    - SparkJobLauncher is started with userId=102
2016-01-20 10:27:17 -67129628 [http-bio-8088-exec-127] INFO    - initialize spark context...
2016-01-20 10:27:19 -67130838 [http-bio-8088-exec-127] INFO    - spark job is returned after 1 seconds.
2016-01-20 10:27:19 -67130839 [http-bio-8088-exec-127] DEBUG   - Creating a new SqlSession

However, no task is started on the Spark cluster, and the spark.waitFor() call returns almost immediately, even though the job should take several seconds to run.
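
For what it's worth, if the cluster runs Spark 1.6 or later, SparkLauncher.startApplication(...) can be used instead of launch(); it returns a SparkAppHandle and reports state transitions, which makes it visible whether the submission ever reached the cluster. A rough sketch, reusing the same configuration fields as in the question (runJobWithHandle is just an illustrative name):

import java.util.concurrent.CountDownLatch;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

// Sketch only (Spark 1.6+): submit through startApplication() and log state
// transitions instead of waiting on the raw spark-submit process.
public static void runJobWithHandle(String userId) throws Exception {
    final CountDownLatch done = new CountDownLatch(1);
    SparkAppHandle handle = new SparkLauncher()
            .setSparkHome(spark_home)          // same configuration fields as in the question
            .setMaster(spark_master)
            .setAppResource(jar_file)
            .setMainClass(main_class)
            .setDeployMode(spark_deploy_mode)
            .addAppArgs(userId)
            .startApplication(new SparkAppHandle.Listener() {
                @Override
                public void stateChanged(SparkAppHandle h) {
                    logger.info("spark application state: " + h.getState());
                    if (h.getState().isFinal()) {
                        done.countDown();
                    }
                }
                @Override
                public void infoChanged(SparkAppHandle h) {
                    logger.info("spark application id: " + h.getAppId());
                }
            });
    done.await();                              // block until FINISHED, FAILED or KILLED
    logger.info("final state: " + handle.getState());
}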

0 Answers:

No answers yet.