Maven cannot resolve dependencies

Posted: 2014-10-23 15:43:15

Tags: maven hadoop apache-spark

I am trying to build a simple Java program: JavaWordCount for spark-1.1.0.

I get this error:

[INFO] Building JavaWordCount 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.279 s
[INFO] Finished at: 2014-10-23T11:28:30-04:00
[INFO] Final Memory: 9M/156M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project JavaWordCount: Could not resolve dependencies for project spark.examples:JavaWordCount:jar:1.0-SNAPSHOT: Failure to find org.apache.spark:spark-assembly_2.10:jar:1.1.0 in http://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException

Here are the dependencies from my pom.xml:

<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-assembly_2.10</artifactId>
    <version>1.1.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-examples_2.10</artifactId>
    <version>1.1.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.1.0</version>
  </dependency>
</dependencies>

It includes the Spark assembly.

Any ideas would be greatly appreciated.

Thanks!

1 Answer:

Answer 0 (score: 2)

The problem is this dependency:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-assembly_2.10</artifactId>
  <version>1.1.0</version>
</dependency>

This artifact is not a jar; only a pom file is published, which means you cannot declare it like that. You can see this in the error message:

Failure to find org.apache.spark:spark-assembly_2.10:jar:1.1.0

which shows that Maven is trying to download a jar file. You have to declare it like this:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-assembly_2.10</artifactId>
  <version>1.1.0</version>
  <type>pom</type>
</dependency>

But I'm not sure whether this will solve all your problems. You should take a closer look at the documentation to check whether this is the right path.

Update: You can also use it as a BOM via:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-assembly_2.10</artifactId>
  <version>1.1.0</version>
  <type>pom</type>
  <scope>import</scope>
</dependency>
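
Note that Maven only honors <scope>import</scope> inside a <dependencyManagement> section; in a plain <dependencies> block it is ignored. A sketch of how that BOM import would look in the POM (assuming the rest of the POM stays unchanged):

```xml
<!-- The import scope pulls the dependencyManagement entries of the
     imported pom into this project's dependency management. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-assembly_2.10</artifactId>
      <version>1.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

After importing, individual Spark artifacts can be listed in <dependencies> without a <version>, since the imported pom manages their versions.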