Oozie Spark Action java.lang.NoClassDefFoundError: org/apache/spark/Logging

Date: 2017-01-13 16:02:48

Tags: hadoop apache-spark cloudera oozie

I launch my Spark application (version 1.6.0) with an Oozie Spark Action in yarn-cluster mode on CDH 5.7.4 with Kerberos. I keep getting this error:

Error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 13 more

What I can't understand is that the same application, launched from the edge node, runs fine. Moreover, I don't use the Logging class anywhere in my application.
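As far as I know, org.apache.spark.Logging is an internal Spark 1.x trait that Spark's own classes extend, so it has to be on the classpath even if my code never references it; its absence would point to mismatched Spark jars at runtime. To check where classes actually come from, I could drop a small probe like this into the driver (a sketch; `ClasspathProbe` and its method are just illustrative names, not from my app):

```java
// Sketch of a classpath diagnostic: report which jar (if any) a class is loaded from.
public class ClasspathProbe {

    // Returns the code-source location of the class, "bootstrap classpath" for
    // JDK core classes, or "NOT FOUND" if the class cannot be loaded at all.
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            return src != null ? src.getLocation().toString() : "bootstrap classpath";
        } catch (ClassNotFoundException e) {
            return "NOT FOUND";
        }
    }

    public static void main(String[] args) {
        // On the cluster these would show which spark-core jar is (or is not) visible.
        System.out.println("org.apache.spark.Logging   -> " + locate("org.apache.spark.Logging"));
        System.out.println("org.apache.spark.SparkConf -> " + locate("org.apache.spark.SparkConf"));
    }
}
```

Running this from inside the Oozie launcher container versus the edge node should reveal whether the two classpaths resolve Spark from different jars.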

Is it possible that the Spark sharelib in HDFS under /user/oozie/share/lib/lib_/spark is in a different state than the nominal SPARK_HOME=/opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/spark?
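One way I could check this is to list the jars Oozie resolves for the spark sharelib and compare them with the parcel contents (a sketch; the Oozie host/port is a placeholder):

```shell
# List the jars Oozie actually resolves for the spark sharelib
# (<oozie-host> is a placeholder for my Oozie server)
oozie admin -oozie http://<oozie-host>:11000/oozie -shareliblist spark

# Compare the sharelib jars in HDFS with the local parcel's Spark jars
hdfs dfs -ls /user/oozie/share/lib/lib_*/spark | awk '{print $NF}' | sort
ls /opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/spark/lib | sort
```

A spark-core jar version in the sharelib that differs from the parcel's would explain why the same application works from the edge node but not under Oozie.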

Here is my workflow.xml file:

<workflow-app name="myApp workflow" xmlns="uri:oozie:workflow:0.5">
<global>
        <configuration>
            <property>
                <name>oozie.launcher.yarn.app.mapreduce.am.env</name>
                <value>SPARK_HOME=/opt/cloudera/parcels/CDH-5.7.4-1.cdh5.7.4.p0.2/lib/spark</value>
            </property>
        </configuration>
</global>
  <credentials>
    <credential name="hcat" type="hcat">
      <property>
        <name>hcat.metastore.uri</name>
        <value>thrift://<XXX></value>
      </property>
      <property>
        <name>hcat.metastore.principal</name>
        <value>hive/<XXX></value>
      </property>
    </credential>
  </credentials>
    <start to="spark"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="spark" cred="hcat">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>yarn</master>
            <mode>cluster</mode>
            <name>AppName</name>
              <class>main.Class</class>
            <jar>${nameNode}/<path>/myJar.jar</jar>
              <spark-opts> --files ${nameNode}/<path>/hive-site.xml, ${nameNode}/<path>/file.conf</spark-opts>
              <arg>--mode history</arg>
              <arg>--base-timestamp 2016-01-20#00:00:00</arg>
              <arg>--conf file.conf</arg>
        </spark>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>
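For completeness, one thing I am considering is pinning the action to the system sharelib explicitly in job.properties (a sketch; these are standard Oozie properties, but the values here are placeholders, not taken from my cluster):

```properties
# job.properties sketch -- make Oozie use the system sharelib
oozie.use.system.libpath=true
# pin the spark action to the 'spark' sharelib explicitly
oozie.action.sharelib.for.spark=spark
```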

Any idea what I'm missing?

0 Answers:

No answers yet