Since version 2.0, Apache Spark is bundled with a folder "jars" full of .jar files. Obviously, Maven downloads all of these jars again when issuing:
mvn -e package
because in order to submit an application with
spark-submit --class DataFetch target/DataFetch-1.0-SNAPSHOT.jar
the DataFetch-1.0-SNAPSHOT.jar is needed.
So, the first question is straightforward: how can I take advantage of those existing jars? The second question is related, since the first time I tried downloading the jars with Maven, I got the following output:
[INFO] Error stacktraces are turned on.
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building "DataFetch" 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ DataFetch ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /root/sparkTests/scalaScripts/DataFetch/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ DataFetch ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ DataFetch ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /root/sparkTests/scalaScripts/DataFetch/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:testCompile (default-testCompile) @ DataFetch ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.10:test (default-test) @ DataFetch ---
[INFO] No tests to run.
[INFO] Surefire report directory: /root/sparkTests/scalaScripts/DataFetch/target/surefire-reports
-------------------------------------------------------
T E S T S
-------------------------------------------------------
Results :
Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO] --- maven-jar-plugin:2.3.2:jar (default-jar) @ DataFetch ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.294s
[INFO] Finished at: Wed Sep 28 17:41:29 PYT 2016
[INFO] Final Memory: 14M/71M
[INFO] ------------------------------------------------------------------------
This is my pom.xml file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.spark.pg</groupId>
    <artifactId>DataFetch</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>"DataFetch"</name>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.0.0</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.0</version>
            </plugin>
        </plugins>
    </build>
</project>
If more information is needed, please feel free to ask.
Answer 0 (score: 1)
I'm not sure whether I understand your question, but I'll try to answer.
Based on the Spark documentation on bundling your application's dependencies:
When creating assembly jars, list Spark and Hadoop as provided dependencies; these need not be bundled since they are provided by the cluster manager at runtime.
You can set the scope to provided in your Maven pom.xml file:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
    <!-- add this scope -->
    <scope>provided</scope>
</dependency>
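Note that ${spark.version} here is a Maven property. Your pom.xml above hard-codes the version instead, so you would either keep the literal 2.0.0 or define a matching property, for example (a sketch, assuming you stay on Spark 2.0.0):

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <!-- assumed property so that ${spark.version} in the dependency resolves -->
    <spark.version>2.0.0</spark.version>
</properties>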
The second thing I noticed is that the Maven build creates an empty JAR:
[WARNING] JAR will be empty - no content was marked for inclusion!
If you have any other dependencies, you should package those dependencies into the final jar archive file.
You can do the following in your pom.xml and then run mvn package:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>2.6</version>
    <configuration>
        <!-- package with project dependencies -->
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <archive>
            <manifest>
                <mainClass>YOUR_MAIN_CLASS</mainClass>
            </manifest>
        </archive>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>
The Maven log should then print a line with the built jar:
[INFO] --- maven-assembly-plugin:2.4.1:single (make-assembly) @ dateUtils ---
[INFO] Building jar: path/target/APPLICATION_NAME-jar-with-dependencies.jar
After the Maven package phase, you should see DataFetch-1.0-SNAPSHOT-jar-with-dependencies.jar in the target folder, and you can submit this jar with spark-submit.
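For example, mirroring the spark-submit command from the question (a sketch only; it assumes DataFetch is still the fully qualified name of your main class and that the default master is acceptable):

spark-submit --class DataFetch target/DataFetch-1.0-SNAPSHOT-jar-with-dependencies.jar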