Spark sbt Scala build error - jars missing from Ivy

Asked: 2017-05-21 17:02:35

Tags: scala apache-spark sbt

I am trying to run a Spark project from the Eclipse IDE. In build.sbt I added the following:

name := "simple-spark-scala"
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.2"

When I import this project, I get errors:

"Project is missing required library", with roughly 100 errors like the following:

Description Resource    Path    Location    Type
Project 'simple-spark' is missing required library: '/root/.ivy2/cache/aopalliance/aopalliance/jars/aopalliance-1.0.jar'    simple-spark        Build path  Build Path Problem

However, all of the jars named in the "missing library" errors are actually present at the paths mentioned.

Any idea how to fix this?

2 Answers:

Answer 0 (score: 0):

Add a dependency resolver to your build.sbt file, as shown below:

resolvers += "MavenRepository" at "https://mvnrepository.com/"

Answer 1 (score: 0):

That is because you have not linked the Spark Packages Repo. You can see my build.sbt below:

name := "spark"

version := "1.0"

scalaVersion := "2.11.8"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.1.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.1.0",
  "org.apache.spark" % "spark-graphx_2.11" % "2.1.0",
  "org.apache.spark" % "spark-mllib_2.11" % "2.1.0",
  "neo4j-contrib" % "neo4j-spark-connector" % "2.0.0-M2"
)
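
Since the reported errors come from the Eclipse build path rather than from sbt itself, it can also help to regenerate the Eclipse project metadata after changing build.sbt. A minimal sketch assuming the sbteclipse plugin (the version shown is one published release; adjust it to match your sbt version), placed in project/plugins.sbt:

// Assumed setup: sbteclipse generates Eclipse .project/.classpath files from the sbt build
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.2.4")

Then run sbt eclipse and refresh the project in Eclipse; this rewrites the .classpath entries that currently point at the "missing" Ivy jars.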