Spark sbt compile error with library dependencies

Date: 2015-10-15 08:36:00

Tags: scala hadoop apache-spark sbt-assembly

I am using Spark 1.2.0-bin-hadoop2.4 and my Scala version is 2.11.7. I am getting the error below, so I cannot use sbt.

~/sparksample$ sbt

Starting sbt: invoke with -help for other options
[info] Set current project to Spark Sample (in build file:/home/beyhan/sparksample/)

> sbt compile

[info] Updating {file:/home/beyhan/sparksample/}default-f390c8...
[info] Resolving org.scala-lang#scala-library;2.11.7 ...
[info] Resolving org.apache.spark#spark-core_2.11.7;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.11.7;1.2.0
[warn] ==== local: tried
[warn] /home/beyhan/.ivy2/local/org.apache.spark/spark-core_2.11.7/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn] http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11.7/1.2.0/spark-core_2.11.7-1.2.0.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/beyhan/sparksample/}default-f390c8/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[error] Total time: 2 s, completed Oct 15, 2015 11:30:47 AM

Any suggestions? Thanks.

3 Answers:

Answer 0 (score: 3):

There is no spark-core_2.11.7 jar file. You have to drop the maintenance version number .7 from your Spark dependency, because spark-core_2.11 is what exists: all Scala releases in the 2.11 line are binary compatible with each other.
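The offending line in build.sbt presumably encodes the full Scala version in the artifact name, something like the first declaration below (a guess, since the question does not show the build file):

// Wrong: asks Ivy for the nonexistent artifact spark-core_2.11.7
libraryDependencies += "org.apache.spark" % "spark-core_2.11.7" % "1.2.0"

// Right: only the Scala binary version 2.11 belongs in the artifact name
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"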

Update

A minimal sbt build file might look like this:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
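Note that the %% operator makes sbt append the Scala binary version (here _2.11, not the full 2.11.7) to the artifact name, so the dependency above resolves to spark-core_2.11 and is equivalent to writing:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.1"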

Answer 1 (score: 1):

As @Till Rohrmann suggested, there is no such thing as spark-core_2.11.7, yet your build.sbt seems to reference that library.

I suggest you edit the file /home/beyhan/sparksample/build.sbt and remove the reference to that library.

The correct reference is:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"

Keep in mind that spark-core is not the only library without a 2.11.7 version; the same applies to any other Spark libraries you might be using.
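For example, if your project also depended on Spark SQL or Spark Streaming (hypothetical additions, not shown in the question), those declarations would need the same binary-version suffix:

libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.2.0"

libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.2.0"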

Answer 2 (score: 0):

[info] Updating {file:/home/beyhan/sparksample/}default-f390c8...
[info] Resolving org.scala-lang#scala-library;2.11.7 ...
[info] Resolving org.apache.spark#spark-core_2.11.7;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.11.7;1.2.0
[warn] ==== local: tried
[warn] /home/beyhan/.ivy2/local/org.apache.spark/spark-core_2.11.7/1.2.0/ivys/ivy.xml
[warn] ==== public: tried