Why is the job submitted from Java failing?

Date: 2015-11-14 04:42:39

Tags: spring hadoop apache-spark bigdata yarn

I am submitting a Spark job from Java as a RESTful service. I keep getting the following error:

  

Application application_1446816503326_0098 failed 2 times due to AM Container for appattempt_1446816503326_0098_000002 exited with exitCode: -1000. For more detailed output, check the application tracking page: http://ip-172-31-34-108.us-west-2.compute.internal:8088/proxy/application_1446816503326_0098/ Then, click on links to logs of each attempt. Diagnostics: java.io.FileNotFoundException: File file:/opt/apache-tomcat-8.0.28/webapps/RESTfulExample/WEB-INF/lib/spark-yarn_2.10-1.3.0.jar does not exist. Failing this attempt. Failing the application.

The spark-yarn_2.10-1.3.0.jar file is present in the lib folder.
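
To double-check which file the spark-yarn classes actually resolve to at runtime inside Tomcat, a couple of lines like the following can be added (shown here as a standalone sketch; the class name is only for illustration):

import java.io.File;
import java.net.URISyntaxException;
import java.net.URL;

public class SparkYarnJarCheck {
    public static void main(String[] args) throws URISyntaxException {
        // Ask the classloader where the spark-yarn classes were actually loaded from.
        URL location = org.apache.spark.deploy.yarn.Client.class
                .getProtectionDomain().getCodeSource().getLocation();
        System.out.println("spark-yarn loaded from: " + location);

        // Check whether the resolved jar really exists on the local filesystem.
        File jar = new File(location.toURI());
        System.out.println("exists on disk: " + jar.exists());
    }
}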

Here is my program:

package SparkSubmitJava;

import org.apache.spark.deploy.yarn.Client;
import org.apache.spark.deploy.yarn.ClientArguments;
import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import java.io.IOException;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.core.Response;

@Path("/spark")
public class JavaRestService {

    @GET
    @Path("/{param}/{param2}/{param3}")
    public Response getMsg(@PathParam("param") String bedroom,
                           @PathParam("param2") String bathroom,
                           @PathParam("param3") String area) throws IOException {
        // Command-line style arguments for the Spark YARN client,
        // mirroring the options normally passed to spark-submit.
        String[] args = new String[] {
            "--name", "JavaRestService",
            "--driver-memory", "1000M",
            "--jar", "/opt/apache-tomcat-8.0.28/webapps/scalatest-0.0.1-SNAPSHOT.jar",
            "--class", "ScalaTest.ScalaTest.ScalaTest",
            "--arg", bedroom,
            "--arg", bathroom,
            "--arg", area,
            "--arg", "yarn-cluster",
        };

        Configuration config = new Configuration();
        System.setProperty("SPARK_YARN_MODE", "true");
        SparkConf sparkConf = new SparkConf();
        ClientArguments cArgs = new ClientArguments(args, sparkConf);
        Client client = new Client(cArgs, config, sparkConf);
        client.run();
        return Response.status(200).entity(client).build();
    }
}
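
For reference, the Spark 1.x YARN documentation describes a spark.yarn.jar property that points the client at a Spark assembly already available to the cluster (for example on HDFS), instead of letting it upload a jar resolved relative to the submitting process. A minimal sketch of setting it before building the client; the HDFS path is only a placeholder, and the rest reuses the args array and imports from the program above:

// Sketch only: the HDFS path below is a placeholder and must match the real cluster.
SparkConf sparkConf = new SparkConf();
sparkConf.set("spark.yarn.jar",
        "hdfs:///user/spark/share/lib/spark-assembly-1.3.0-hadoop2.4.0.jar");

Configuration config = new Configuration();
ClientArguments cArgs = new ClientArguments(args, sparkConf);
Client client = new Client(cArgs, config, sparkConf);
client.run();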

Any help will be appreciated.

0 Answers:

There are no answers.