Building an executable jar for spark-submit

Time: 2018-02-28 15:47:54

Tags: java maven maven-plugin executable-jar spark-submit

I am writing a Spark application in Java (not Scala). Something like this:

SparkConf conf = new SparkConf().setAppName("TEST");
JavaSparkContext sc = new JavaSparkContext(conf);
sc.setLogLevel("WARN");
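
For reference, here is a minimal sketch of how that snippet sits inside my entry-point class (the package and class name below match the mainClass declared in the pom further down; the job body is only illustrative):

package com.walmart.china.mdmDqci;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Entry point whose fully qualified name is what spark-submit needs to load.
public class DQCIApplicatoin {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("TEST");
        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.setLogLevel("WARN");
        // ... actual job logic omitted ...
        sc.stop();
    }
}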

I started with a simple Java project. To build the project into a runnable jar for spark-submit, I simply right-click the project and export it, as shown in pic1 below:

pic1: exporting the runnable jar and extracting the required libs

pic2: contents of the runnable jar, with all libs included

pic3: the MANIFEST.MF file

It works well.

Then I tried to restructure it into a Maven project. I intend to package the project into a single jar with the required libs included rather than nested. Below is the relevant part of the pom.xml:



<build>
    <finalName>${project.artifactId}</finalName>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <excludes><exclude>**/dqci.properties</exclude></excludes>
        </resource>
    </resources>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.6.1</version>
            <configuration>
                <source>1.7</source>
                <target>1.7</target>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <version>2.5</version>
            <configuration>
                <archive>
                    <manifest>
                        <mainClass>com.walmart.china.mdmDqci.DQCIApplicatoin</mainClass>
                    </manifest>
                </archive>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
  </build>

pic4: what it looks like in the new jar

pic5: the new MANIFEST.MF

However, when I run the jar in the same environment, the following exception is thrown:

java.lang.ClassNotFoundException: com.walmart.china.Main
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.spark.util.Utils$.classForName(Utils.scala:175)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
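
As a sanity check, a small helper like the sketch below (ManifestCheck is a hypothetical name, and the jar path is passed as the first argument) can print the Main-Class recorded in the assembled jar's manifest, to compare against the class name in the exception above:

import java.util.jar.JarFile;
import java.util.jar.Manifest;

// Prints the Main-Class attribute from a jar's META-INF/MANIFEST.MF.
// Usage: java ManifestCheck path/to/your-jar-with-dependencies.jar
public class ManifestCheck {
    public static void main(String[] args) throws Exception {
        try (JarFile jar = new JarFile(args[0])) {
            Manifest mf = jar.getManifest();
            System.out.println("Main-Class: "
                    + mf.getMainAttributes().getValue("Main-Class"));
        }
    }
}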

My assumption is that I am not using the right plugins for my project. Any suggestions? Thanks in advance.

0 answers:

No answers yet.