Maven build with Jenkins for a Scala Spark program: "No primary artifact to install, installing attached artifacts instead"

Date: 2017-11-25 23:00:54

Tags: scala maven apache-spark jenkins jenkins-pipeline

I have a project with multiple Scala Spark programs. When I run mvn install through Eclipse, I get the correct jar generated, and it works with the spark-submit command.

After pushing the code to Git, we tried to build it with Jenkins, since we want to use Ansible to automatically push the jar file to our Hadoop cluster. Our Jenkinsfile invokes Maven with the goals "compile package install -X".

The log shows:

[DEBUG](f)artifact = com.esi.rxhome:PROJECT1:jar:0.0.1-SNAPSHOT

[DEBUG](f) attachedArtifacts = [com.esi.rxhome:PROJECT1:jar:jar-with-dependencies:0.0.1-SNAPSHOT, com.esi.rxhome:PROJECT1:jar:jar-with-dependencies:0.0.1-SNAPSHOT]

[DEBUG]   (f) createChecksum = false

[DEBUG]   (f) localRepository =       id: local
  url: file:///home/jenkins/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]

[DEBUG]   (f) packaging = jar

[DEBUG]   (f) pomFile = /opt/jenkins-data/workspace/ng_datahub-pipeline_develop-JYTJLDEXV65VZWDCZAXG5Y7SHBG2534GFEF3OF2WC4543G6ANZYA/pom.xml

[DEBUG]   (s) skip = false

[DEBUG]   (f) updateReleaseInfo = false

[DEBUG] -- end configuration --

[INFO] **No primary artifact to install, installing attached artifacts instead**

I saw this error in a similar post:

Maven: No primary artifact to install, installing attached artifacts instead

But the answer there says to remove the automatic clean, and I don't know how to stop that when Jenkins builds the jar file.

Below is the pom.xml:

            <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
                <modelVersion>4.0.0</modelVersion>
                <groupId>com.esi.rxhome</groupId>
                <artifactId>PROJECT1</artifactId>
                <version>0.0.1-SNAPSHOT</version>
                <packaging>jar</packaging>
                <name>${project.artifactId}</name>
                <description>RxHomePreprocessing</description>
                <inceptionYear>2015</inceptionYear>
                <licenses>
                    <license>
                        <name>My License</name>
                        <url>http://....</url>
                        <distribution>repo</distribution>
                    </license>
                </licenses>

                <properties>
                    <maven.compiler.source>1.8</maven.compiler.source>
                    <maven.compiler.target>1.8</maven.compiler.target>
                    <encoding>UTF-8</encoding>
                    <scala.version>2.10.6</scala.version>
                    <scala.compat.version>2.10</scala.compat.version>
                </properties>

                <dependencies>
                    <dependency>
                        <groupId>org.scala-lang</groupId>
                        <artifactId>scala-library</artifactId>
                        <version>${scala.version}</version>
                    </dependency>

                    <!-- Test -->
                    <dependency>
                        <groupId>junit</groupId>
                        <artifactId>junit</artifactId>
                        <version>4.11</version>
                        <scope>test</scope>
                    </dependency>
                    <dependency>
                        <groupId>org.specs2</groupId>
                        <artifactId>specs2-core_${scala.compat.version}</artifactId>
                        <version>2.4.16</version>
                        <scope>test</scope>
                    </dependency>
                    <dependency>
                        <groupId>org.scalatest</groupId>
                        <artifactId>scalatest_${scala.compat.version}</artifactId>
                        <version>2.2.4</version>
                        <scope>test</scope>
                    </dependency>
                    <dependency>
                        <groupId>org.apache.hive</groupId>
                        <artifactId>hive-jdbc</artifactId>
                        <version>1.2.1000.2.6.0.3-8</version>
                    </dependency>


                    <!-- <dependency>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-core_2.10</artifactId>
                        <version>2.1.0</version>
                    </dependency>

                    <dependency>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-sql_2.10</artifactId>
                        <version>2.1.0</version>
                    </dependency> -->


                    <dependency>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-core_2.10</artifactId>
                        <version>1.6.3</version>
                    </dependency>
                    <dependency>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-sql_2.10</artifactId>
                        <version>1.6.3</version>
                    </dependency>
                    <dependency>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-hive_2.10</artifactId>
                        <version>1.6.3</version>
                    </dependency>

                    <dependency>
                        <groupId>com.databricks</groupId>
                        <artifactId>spark-csv_2.10</artifactId>
                        <version>1.5.0</version>
                    </dependency>

                </dependencies>

                <build>
                    <sourceDirectory>src/main/scala</sourceDirectory>
                    <testSourceDirectory>src/test/scala</testSourceDirectory>
                    <plugins>
                        <plugin>
                            <!-- see http://davidb.github.com/scala-maven-plugin -->
                            <groupId>net.alchim31.maven</groupId>
                            <artifactId>scala-maven-plugin</artifactId>
                            <version>3.2.0</version>
                            <executions>
                                <execution>
                                    <goals>
                                        <goal>compile</goal>
                                        <goal>testCompile</goal>
                                    </goals>
                                    <configuration>
                                        <args>
                                            <arg>-make:transitive</arg>
                                            <arg>-dependencyfile</arg>
                                            <arg>${project.build.directory}/.scala_dependencies</arg>
                                        </args>
                                    </configuration>
                                </execution>
                            </executions>
                        </plugin>
                        <plugin>
                            <groupId>org.apache.maven.plugins</groupId>
                            <artifactId>maven-surefire-plugin</artifactId>
                            <version>2.18.1</version>
                            <configuration>
                                <useFile>false</useFile>
                                <disableXmlReport>true</disableXmlReport>
                                <!-- If you have classpath issue like NoDefClassError,... -->
                                <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                                <includes>
                                    <include>**/*Test.*</include>
                                    <include>**/*Suite.*</include>
                                </includes>
                            </configuration>
                        </plugin>
                        <plugin>
                            <groupId>org.apache.maven.plugins</groupId>
                            <artifactId>maven-jar-plugin</artifactId>
                            <version>2.4</version>
                            <configuration>
                                <skipIfEmpty>true</skipIfEmpty>
                            </configuration>
                            <executions>
                                <execution>
                                    <goals>
                                        <goal>jar</goal>
                                    </goals>
                                </execution>
                            </executions>
                        </plugin>

                        <plugin>
                            <groupId>org.apache.maven.plugins</groupId>
                            <artifactId>maven-assembly-plugin</artifactId>
                            <version>3.0.0</version>
                            <configuration>
                                <descriptorRefs>
                                    <descriptorRef>jar-with-dependencies</descriptorRef>
                                </descriptorRefs>
                                <archive>
                                  <manifest>
                                    <mainClass>com.esi.spark.storedprocedure.Test_jdbc_nospark</mainClass>
                                  </manifest>
                                </archive>
                            </configuration>
                            <executions>
                              <execution>
                                <id>make-assembly</id>
                                <phase>package</phase>
                                <goals>
                                    <goal>single</goal>
                                </goals>
                              </execution>
                            </executions>
                        </plugin>

                        <plugin>
                          <artifactId>maven-clean-plugin</artifactId>
                          <version>3.0.0</version>
                          <configuration>
                            <skip>true</skip>
                          </configuration>
                        </plugin>

                    </plugins>
                </build>
            </project>

I tried specifying:

1 - "jar" as the packaging in pom.xml.

2 - Changing the Maven goals to:

"install"

"clean install"

"compile package install"

But the attempts above did not get rid of the message, and the jar that was created was useless.

When I tried to execute the spark-submit command:

            spark-submit  --driver-java-options -Djava.io.tmpdir=/home/EH2524/tmp --conf spark.local.dir=/home/EH2524/tmp --driver-memory 2G --executor-memory 2G --total-executor-cores 1 --num-executors 10 --executor-cores 10 --class com.esi.spark.storedprocedure.Test_jdbc_nospark  --master yarn  /home/EH2524/PROJECT1-0.0.1-20171124.213717-1-jar-with-dependencies.jar
            Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
            Spark1 will be picked by default
            java.lang.ClassNotFoundException: com.esi.spark.storedprocedure.Test_jdbc_nospark
                    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
                    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
                    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
                    at java.lang.Class.forName0(Native Method)
                    at java.lang.Class.forName(Class.java:348)
                    at org.apache.spark.util.Utils$.classForName(Utils.scala:175)
                    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:703)
                    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
                    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
                    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
                    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Here, Test_jdbc_nospark is a Scala object.
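Before re-running spark-submit, it is worth checking whether the jar actually contains the compiled class. A jar is just a zip, so its entry list can be inspected directly. The sketch below fabricates a stand-in jar so the check runs anywhere; in practice, point JAR at the jar-with-dependencies file under target/:

```shell
# Stand-in jar with the expected class path; replace with your real
# artifact, e.g. JAR=target/PROJECT1-0.0.1-SNAPSHOT-jar-with-dependencies.jar
WORK=$(mktemp -d)
mkdir -p "$WORK/src/com/esi/spark/storedprocedure"
: > "$WORK/src/com/esi/spark/storedprocedure/Test_jdbc_nospark.class"
JAR="$WORK/demo.jar"
(cd "$WORK/src" && python3 -m zipfile -c "$JAR" com)

# List the jar entries and look for the main class
python3 -m zipfile -l "$JAR" | grep "Test_jdbc_nospark" \
  && echo "main class present" \
  || echo "main class missing: the jar was built without compiled sources"
```

An empty listing, or one with only META-INF entries, means the sources were never compiled into the jar, which matches the ClassNotFoundException above.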

2 Answers:

Answer 0 (score: 0):

I'm not sure, but your maven-jar-plugin configuration looks suspicious. Normally, an execution would specify a phase, as in

  <execution>
    <phase>package</phase>
    <goals>
      <goal>jar</goal>
    </goals>
  </execution>
(from this example). Perhaps omitting the phase is causing your default jar not to be built? The error message certainly sounds like your default jar is not being built, but you haven't actually said whether that's the case.

Answer 1 (score: 0):

This message appeared because the maven-jar-plugin had `<skipIfEmpty>` set to `true`. Once I removed this, the build no longer gave the message "No primary artifact to install, installing attached artifacts instead".
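For reference, a sketch of what the plugin block from the question looks like with the flag removed (the rest of the pom is assumed unchanged):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.4</version>
    <!-- <skipIfEmpty>true</skipIfEmpty> removed: with it set, the plugin
         skips the primary jar when no classes were compiled, so only the
         attached jar-with-dependencies artifact gets installed -->
    <executions>
        <execution>
            <goals>
                <goal>jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```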

An empty jar was being created because of an incorrect path in the pom.xml.

Initially:

    <build>
      <sourceDirectory>src/main/scala</sourceDirectory>

When Jenkins built the code from Git, the pom was inside the project folder, so the source path had to be prefixed with it:

    <build>
      <sourceDirectory>folder_name_in_git/src/main/scala</sourceDirectory>