Why does sbt report an unresolved dependency for the Spark 1.3.0-SNAPSHOT jars?

Asked: 2014-12-28 16:58:14

Tags: scala sbt apache-spark

My sbt build file contains the following:

name := "Simple Project" 
version := "1.3.0-SNAPSHOT" 
scalaVersion := "2.10.4" 
libraryDependencies += "org.apache.spark" % "spark-core" % "1.3.0-SNAPSHOT" 

Running sbt package on the project produces the following error:

[info] Set current project to Simple Project (in build file:/home/roott/SparkProjects/checkProject/) 
[info] Updating {file:/home/roott/SparkProjects/checkProject/}default-9d4332... 
[info] Resolving org.scala-lang#scala-library;2.10.4 ... 
[info] Resolving org.apache.spark#spark-core;1.3.0-SNAPSHOT ... 
[warn]  module not found: org.apache.spark#spark-core;1.3.0-SNAPSHOT 
[warn] ==== local: tried 
[warn]   /home/roott/.ivy2/local/org.apache.spark/spark-core/1.3.0-SNAPSHOT/ivys/ivy.xml 
[warn] ==== public: tried 
[warn]   http://repo1.maven.org/maven2/org/apache/spark/spark-core/1.3.0-SNAPSHOT/spark-core-1.3.0-SNAPSHOT.pom
[warn]  :::::::::::::::::::::::::::::::::::::::::::::: 
[warn]  ::          UNRESOLVED DEPENDENCIES         :: 
[warn]  :::::::::::::::::::::::::::::::::::::::::::::: 
[warn]  :: org.apache.spark#spark-core;1.3.0-SNAPSHOT: not found 
[warn]  :::::::::::::::::::::::::::::::::::::::::::::: 
[error] {file:/home/roott/SparkProjects/checkProject/}default-9d4332/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core;1.3.0-SNAPSHOT: not found 
[error] Total time: 2 s, completed 28-Dec-2014 16:49:50 

What does this error mean, and how do I fix it?

1 answer:

Answer 0 (score: 3)

I don't think the Spark project publishes 1.3.0-SNAPSHOT binaries anywhere, so you have to build Spark locally yourself and reference that build from your project.

Once you have built Spark following the Building Spark guide, using Apache Maven as the build tool, you need to add your local Maven repository to your sbt build with resolvers += Resolver.mavenLocal. Read about Resolvers in the official sbt documentation.
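
Putting this together, a minimal sketch of what the build file could look like is shown below. It assumes Spark 1.3.0-SNAPSHOT has been built from source and installed into the local Maven repository (for example with mvn -DskipTests clean install from the Spark checkout, as described in Building Spark), and it uses %% so that sbt requests the Scala-versioned artifact spark-core_2.10 that the Spark build actually produces:

name := "Simple Project"

// The project's own version is independent of the Spark version.
version := "1.0"

scalaVersion := "2.10.4"

// Look in ~/.m2/repository, where a local Maven build of Spark installs its artifacts.
resolvers += Resolver.mavenLocal

// %% appends the Scala binary version (_2.10) to the artifact name,
// matching the spark-core_2.10 artifact produced by the Spark build.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0-SNAPSHOT"

With that resolver in place, sbt package should resolve spark-core_2.10;1.3.0-SNAPSHOT from the local Maven repository instead of only trying the local Ivy cache and Maven Central.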