Exception when submitting a Spark job to yarn-cluster from a remote JVM

Asked: 2015-07-30 04:28:13

Tags: apache-spark yarn apache-spark-sql

I am using the Java code below to submit a job to yarn-cluster.

// Imports needed by this snippet (Spark 1.3.x YARN client API, Hadoop 2.x):
// import org.apache.hadoop.conf.Configuration;
// import org.apache.hadoop.yarn.api.records.ApplicationId;
// import org.apache.spark.SparkConf;
// import org.apache.spark.deploy.yarn.Client;
// import org.apache.spark.deploy.yarn.ClientArguments;

public ApplicationId submitQuery(String requestId, String query, String fileLocations) {
    String driverJar = getDriverJar();
    String driverClass = propertyService.getAppPropertyValue(TypeString.QUERY_DRIVER_CLASS);
    String driverAppName = propertyService.getAppPropertyValue(TypeString.DRIVER_APP_NAME);
    String extraJarsNeeded = propertyService.getAppPropertyValue(TypeString.DRIVER_EXTRA_JARS_NEEDED);

    String[] args = new String[] {
        // the name of your application
        "--name",
        driverAppName,

        // memory for driver (optional)
        "--driver-memory",
        "1000M",

        // path to your application's JAR file
        // required in yarn-cluster mode
        "--jar",
        "local:/home/ankit/Repository/Personalization/rtis/Cust360QueryDriver/target/SnapdealCustomer360QueryDriver-jar-with-selective-dependencies.jar",

        // extra jars to ship alongside the application jar
        "--addJars",
        "local:/home/ankit/Downloads/lib/spark-assembly-1.3.1-hadoop2.4.0.jar,local:/home/ankit/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar,local:/home/ankit/.m2/repository/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar",

        // name of your application's main class (required)
        "--class",
        driverClass,

        // arguments passed through to the driver's main()
        "--arg",
        requestId,

        "--arg",
        query,

        "--arg",
        fileLocations,

        "--arg",
        "yarn-client"
    };

    System.setProperty("HADOOP_CONF_DIR", "/home/hduser/hadoop-2.7.0/etc/hadoop");
    Configuration config = new Configuration();
    config.set("yarn.resourcemanager.address", propertyService.getAppPropertyValue(TypeString.RESOURCE_MANGER_URL));
    config.set("fs.default.name", propertyService.getAppPropertyValue(TypeString.FS_DEFAULT_NAME));

    System.setProperty("SPARK_YARN_MODE", "true");

    SparkConf sparkConf = new SparkConf();
    ClientArguments cArgs = new ClientArguments(args, sparkConf);

    // create a YARN Client instance and submit the application
    Client client = new Client(cArgs, config, sparkConf);
    ApplicationId id = client.submitApplication();

    return id;
}
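
For reference, the returned ApplicationId can also be fed to Hadoop's YarnClient to poll the application's state from the same JVM; a minimal sketch (not part of the original question), assuming it is given the same Configuration object that was passed to the Spark Client:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.ApplicationId;
import org.apache.hadoop.yarn.api.records.ApplicationReport;
import org.apache.hadoop.yarn.api.records.YarnApplicationState;
import org.apache.hadoop.yarn.client.api.YarnClient;

// Hedged helper sketch: look up the YARN state of a submitted application.
public YarnApplicationState getApplicationState(Configuration config, ApplicationId id) throws Exception {
    YarnClient yarnClient = YarnClient.createYarnClient();
    yarnClient.init(config);   // reuses the Configuration pointing at the ResourceManager
    yarnClient.start();
    try {
        ApplicationReport report = yarnClient.getApplicationReport(id);
        return report.getYarnApplicationState();
    } finally {
        yarnClient.stop();
    }
}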

The job is submitted to the YARN cluster and I can retrieve the application id, but I get the following exception when the job runs on the Spark cluster:

 Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 13 more

The class is present in /home/ankit/Downloads/lib/spark-assembly-1.3.1-hadoop2.4.0.jar, so it looks as if the jars listed in --addJars are not being added to the driver's Spark context.

What am I doing wrong? Any help would be appreciated.

2 Answers:

Answer 0 (score: 0)

Are you deploying on Cloudera's distribution? The spark.yarn.jar setting in the CDH 5.4 configuration has a 'local:' prefix for local files, but Spark versions >= 1.5 do not accept this; you should use the plain full path to the Spark assembly instead. See also here.
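
In the question's code, that fix would amount to something like the following; a minimal sketch, assuming the assembly at the path from the question is readable from the submitting JVM:

// Hedged sketch of the suggestion above: point spark.yarn.jar at the assembly
// with a plain path (no "local:" prefix) so it is shipped to the cluster.
SparkConf sparkConf = new SparkConf();
sparkConf.set("spark.yarn.jar",
        "/home/ankit/Downloads/lib/spark-assembly-1.3.1-hadoop2.4.0.jar");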

Answer 1 (score: 0)

Try building the JAR without the Spark dependencies and pass the dependent jars with --jars in spark-submit. Most of the time a ClassNotFoundException arises because both Spark and the application itself depend on the same jar. Suggested solutions (a rough sketch of the first follows the list):

  1. Package without dependencies and add the dependent jars via --jars during spark-submit
  2. Modify the application to use the same version of the third-party library as Spark does
  3. Use shading in your build tool
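
In the question's programmatic style, suggestion 1 would look roughly like this; a hedged sketch in which the rebuilt jar, the dependency locations, and the class name are all illustrative placeholders:

// Hedged sketch of suggestion 1 using the question's ClientArguments API:
// the application jar is rebuilt without Spark classes, and third-party
// dependencies travel via --addJars with plain (uploadable) paths.
// All paths and names below are hypothetical.
String[] args = new String[] {
    "--name", "Cust360QueryDriver",
    "--jar", "/path/to/QueryDriver-without-spark-deps.jar",
    "--addJars", "/path/to/slf4j-api-1.7.5.jar,/path/to/slf4j-log4j12-1.7.5.jar",
    "--class", "com.example.QueryDriver"
};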