I am submitting/running multiple applications through SparkLauncher from a Java web application. It looks like only one application actually gets submitted. Here is my code:
import java.io.IOException;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

Runnable run = new Runnable() {
    @Override
    public void run() {
        try {
            SparkAppHandle sparkApp = new SparkLauncher()
                    .setAppResource("C:\\BigData\\spark\\examples\\jars\\spark-examples_2.11-2.4.0.jar")
                    .setMainClass("org.apache.spark.examples.SparkPi")
                    .setMaster("spark://192.168.2.233:7077")
                    .setConf("spark.scheduler.mode", "FAIR")
                    .setVerbose(true)
                    .setConf("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                    .setConf("spark.sql.inMemoryColumnarStorage.batchSize", "10000")
                    .setConf("spark.sql.codegen", "false")
                    .setConf("spark.submit.deployMode", "client")
                    .setConf("spark.executor.memory", "1g")
                    .setConf("spark.driver.memory", "1g")
                    .setConf("spark.cores.max", "1")
                    .setConf("spark.executor.cores", "1")
                    .setConf("spark.executor.instances", "1")
                    .setConf("spark.driver.host", "192.168.2.233")
                    // .setConf("spark.dynamicAllocation.enabled", "true")
                    // .setConf("spark.shuffle.service.enabled", "true")
                    .startApplication();
            System.out.println(sparkApp.getState());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
};
// Run twice so as to submit two parallel applications;
// in the real application logic, different args would be passed to each app.
new Thread(run).start();
new Thread(run).start();
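One thing worth noting: startApplication() returns as soon as the spark-submit child process is spawned, so the getState() printed right above is usually still UNKNOWN. A sketch of how I can watch each submission's state transitions instead, using the standard SparkAppHandle.Listener callbacks (imports and IOException handling as in the code above; the log format is just for illustration):

SparkAppHandle handle = new SparkLauncher()
        .setAppResource("C:\\BigData\\spark\\examples\\jars\\spark-examples_2.11-2.4.0.jar")
        .setMainClass("org.apache.spark.examples.SparkPi")
        .setMaster("spark://192.168.2.233:7077")
        .startApplication(new SparkAppHandle.Listener() {
            @Override
            public void stateChanged(SparkAppHandle h) {
                // Fires on every transition, e.g. CONNECTED -> SUBMITTED -> RUNNING.
                System.out.println(h.getAppId() + " -> " + h.getState());
            }

            @Override
            public void infoChanged(SparkAppHandle h) {
                // Fires when handle info (such as the app id) becomes available.
            }
        });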
I have a standalone cluster with two workers: node 1 (8 GB, 4 cores) and node 2 (8 GB, 2 cores). The master runs on node 1, and the drivers also run only on node 1.
Even when the second thread starts its application, nothing seems to happen; the second application never even shows up in the WAITING state, which would at least have been understandable.
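For reference, one way to rule out the second spark-submit child process dying silently would be to redirect each child's stdout/stderr to its own files (a sketch; the file naming is an arbitrary placeholder):

import java.io.File;

// Give each launched spark-submit child its own log files, so a failure of
// the second launch is visible even if it never contacts the master.
long id = System.nanoTime(); // placeholder to keep file names unique
SparkAppHandle handle = new SparkLauncher()
        .setAppResource("C:\\BigData\\spark\\examples\\jars\\spark-examples_2.11-2.4.0.jar")
        .setMainClass("org.apache.spark.examples.SparkPi")
        .setMaster("spark://192.168.2.233:7077")
        .redirectOutput(new File("spark-launcher-" + id + ".out"))
        .redirectError(new File("spark-launcher-" + id + ".err"))
        .startApplication();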