Scala-Spark: error during Maven build

Date: 2015-11-30 03:34:42

Tags: eclipse scala maven apache-spark

I am trying to create a simple Spark application using Eclipse and Maven. During the Maven build I get the following error:

Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (default) on project XXXX: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

I am using the following pom.xml for the Maven build:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>SparkScalaMM</groupId>
  <artifactId>MMSpark</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>${project.artifactId}</name>
  <description>My wonderfull scala app</description>
  <inceptionYear>2015</inceptionYear>
  <licenses>
    <license>
      <name>My License</name>
      <url>http://....</url>
      <distribution>repo</distribution>
    </license>
  </licenses>

  <properties>
    <maven.compiler.source>1.6</maven.compiler.source>
    <maven.compiler.target>1.6</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.version>2.11.5</scala.version>
    <scala.compat.version>2.11</scala.compat.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.specs2</groupId>
      <artifactId>specs2-core_${scala.compat.version}</artifactId>
      <version>2.4.16</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest_${scala.compat.version}</artifactId>
      <version>2.2.4</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.5.2</version>
    </dependency>
  </dependencies>
  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.0</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
            <configuration>
              <args>
                <arg>-make:transitive</arg>
                <arg>-dependencyfile</arg>
                <arg>${project.build.directory}/.scala_dependencies</arg>
              </args>
            </configuration>
          </execution>
        </executions>
      </plugin>

      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.18.1</version>
        <configuration>
          <useFile>false</useFile>
          <disableXmlReport>true</disableXmlReport>
          <includes>
            <include>**/*Test.*</include>
            <include>**/*Suite.*</include>
          </includes>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>

4 Answers:

Answer 0 (score: 2)

Refer to this link.


We have to bind the Scala compilation first to the process-resources phase and then to the process-test-resources phase; only then does the scala-maven-plugin pick up classpath resources correctly. See the sketch below.
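
A minimal sketch of what that phase binding might look like for the scala-maven-plugin from the question's POM (the execution ids are illustrative, not required by the plugin):

<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>3.2.0</version>
  <executions>
    <!-- compile Scala main sources during process-resources -->
    <execution>
      <id>scala-compile-first</id>
      <phase>process-resources</phase>
      <goals>
        <goal>add-source</goal>
        <goal>compile</goal>
      </goals>
    </execution>
    <!-- compile Scala test sources during process-test-resources -->
    <execution>
      <id>scala-test-compile</id>
      <phase>process-test-resources</phase>
      <goals>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
</plugin>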

Answer 1 (score: 1)

Use what I have provided below, and make sure the Scala library version in the project build path is 2.11.x.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.spark-scala</groupId>
    <artifactId>spark-scala</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>${project.artifactId}</name>
    <description>Spark in Scala</description>
    <inceptionYear>2010</inceptionYear>

    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <encoding>UTF-8</encoding>
        <scala.tools.version>2.11</scala.tools.version>
        <!-- Put the Scala version of the cluster -->
        <scala.version>2.11.5</scala.version>
    </properties>

    <!-- repository to add org.apache.spark -->
    <repositories>
        <repository>
            <id>cloudera-repo-releases</id>
            <url>https://repository.cloudera.com/artifactory/repo/</url>
        </repository>
    </repositories>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.1</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>1.2.1</version>
        </dependency>
    </dependencies>
</project>
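
Whichever POM you adapt, the Scala binary-version suffix on the Spark artifact must match the Scala library you compile against (the question's POM mixes Scala 2.11 with spark-core_2.10, which answer 1 corrects). One way to keep the two in sync is to reference the version property in the dependency; a sketch using the scala.tools.version property from the POM above:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.tools.version}</artifactId>
    <version>1.2.1</version>
</dependency>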

Answer 2 (score: 0)

http://freecontent.manning.com/wp-content/uploads/how-to-start-developing-spark-applications-in-eclipse.pdf describes a great way to start coding with Apache Spark. It specifies a remote archetype, which I added to my Maven archetype catalog; I now use it to build Spark applications.

Answer 3 (score: -1)

First, shut down zinc (the incremental compile server). See the documentation here:

./build/zinc-<version>/bin/zinc -shutdown