How to fix "unresolved dependencies" and "download failed" errors when compiling Scala code

Date: 2019-04-30 01:08:00

Tags: sbt scala-2.10 sbt-0.13 apache-spark-1.5.2

I am trying to build a jar file with sbt, but when I run the "package" command I get errors such as "download failed" and "unresolved dependencies".

The build.sbt file I have is:

name := "SparkPi"

version := "0.13"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2"
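
A commonly suggested workaround for unresolved Hadoop transitive dependencies and failed jar downloads is to declare explicit resolvers in build.sbt, then clear the Ivy cache and re-run the build. This is a sketch only, not verified against this project; the resolver names and URLs below are assumptions, not from the original post:

```scala
name := "SparkPi"

version := "0.13"

scalaVersion := "2.10.4"

// Assumption: pointing sbt at Maven Central (and a Typesafe mirror)
// explicitly can help when transitive Hadoop/avro artifacts fail to
// resolve or download from the default repositories.
resolvers ++= Seq(
  "Maven Central" at "https://repo1.maven.org/maven2/",
  "Typesafe Releases" at "https://repo.typesafe.com/typesafe/releases/"
)

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2"
```

A companion step often paired with this: delete the partially downloaded artifacts under ~/.ivy2/cache (e.g. the org.apache.avro and commons-codec directories listed in the FAILED DOWNLOADS section) before re-running sbt package, since Ivy will not re-fetch a jar it believes it already has.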

The Scala code I have is:

import scala.math.random

object LocalPi {
 def main(args: Array[String]): Unit = {
  var count = 0
  for (i <- 1 to 100000) {
   val x = random * 2 - 1
   val y = random * 2 - 1
   if (x*x + y*y <= 1) count += 1
  }
  println(s"Pi is roughly ${4 * count / 100000.0}")
 }
}

The error I keep getting is:

[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.hadoop#hadoop-mapreduce-client-app;2.2.0: not found
[warn]  :: org.apache.hadoop#hadoop-yarn-api;2.2.0: not found
[warn]  :: org.apache.hadoop#hadoop-mapreduce-client-core;2.2.0: not found
[warn]  :: org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.2.0: not found
[warn]  :: asm#asm;3.1: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::              FAILED DOWNLOADS            ::
[warn]  :: ^ see resolution messages for details  ^ ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.avro#avro;1.7.7!avro.jar
[warn]  :: commons-codec#commons-codec;1.4!commons-codec.jar

Can anyone help me understand what the problem is and how to fix it?

0 Answers:

No answers yet.