Spark unresolved dependencies: hadoop

Date: 2015-02-13 15:11:10

Tags: sbt apache-spark

I am trying to build the self-contained example Scala application, but when I run sbt package I get the following:

[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.hadoop#hadoop-yarn-common;1.0.4: not found
[warn]  :: org.apache.hadoop#hadoop-yarn-client;1.0.4: not found
[warn]  :: org.apache.hadoop#hadoop-yarn-api;1.0.4: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/niko/workspace/Spark/recommender/}default-3ebb80/*:update: sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-yarn-common;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-client;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-api;1.0.4: not found

Does anyone know what has to be configured so that the application builds and runs successfully (ideally without having Hadoop installed)?

Thanks!
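
For context, a minimal build.sbt for this kind of self-contained Spark application typically looks like the sketch below; the project name and the Spark/Scala version numbers are assumptions based on the early-2015 timeframe, not taken from the original question:

// minimal sbt build definition for a standalone Spark application
// (name and versions are illustrative, not from the original question)
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

// spark-core pulls in the Hadoop client libraries transitively
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"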

4 Answers:

Answer 0 (score: 2)

The problem is that sbt cannot retrieve the artifacts because the repository URLs it is using are unavailable.

Download the latest sbt release and add the following to ~/.sbt/repositories:

[repositories]
  local
  sbt-releases-repo: http://repo.typesafe.com/typesafe/ivy-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
  sbt-plugins-repo: http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
  maven-central: http://repo1.maven.org/maven2/
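
If resolution still fails after that, you can force sbt to resolve against only the repositories listed in that file, ignoring any resolvers declared in the build itself. A minimal sketch, assuming the standard sbt.override.build.repos JVM property:

# make sbt ignore build-defined resolvers and use ~/.sbt/repositories only
sbt -Dsbt.override.build.repos=true package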

Answer 1 (score: 1)

You have the YARN profile enabled but have not set hadoop.version. The default Hadoop version is 1.0.4, and there is no YARN release of it. In general, you should specify hadoop.version explicitly anyway.
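
For example, when building Spark from source with YARN support, the Spark build documentation of that era has you pass a Hadoop version that actually ships YARN; 2.4.0 below is illustrative:

# build Spark with the YARN profile against a YARN-capable Hadoop release
mvn -Pyarn -Dhadoop.version=2.4.0 -DskipTests clean package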

Answer 2 (score: 0)

I ran into exactly the same problem when trying to build my Spark program, and found that my sbt version was what was causing the error. I recommend removing sbt completely, then downloading it from http://www.scala-sbt.org/download.html. Get the .tgz, extract it into your home folder, and add its bin directory to your PATH, as sketched below.
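
A rough sketch of those steps on Linux; the archive name and extraction directory are assumptions and depend on the release you download:

# remove any previously installed sbt first, then:
cd ~
tar -xzf sbt-0.13.7.tgz            # archive downloaded from scala-sbt.org
export PATH=$PATH:$HOME/sbt/bin    # add this line to ~/.bashrc to make it permanent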

Answer 3 (score: 0)

I solved it by enabling the auto-import option and then recompiling.