I created a simple SBT project in the IntelliJ IDE, with the following library dependencies in build.sbt:
import _root_.sbt.Keys._
name := "untitled"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "1.5.1",
"org.apache.spark" %% "spark-sql" % "1.5.1" ,
"org.apache.spark" %% "spark-mllib" % "1.5.1")
The goal is to import Spark and MLlib and then create a Scala object following the instructions here.
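Roughly speaking, that object would look something like the sketch below; the object name, the local master setting, and the MLlib call are illustrative assumptions, not taken from the linked instructions:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors

object Main {
  def main(args: Array[String]): Unit = {
    // A local SparkContext, just enough to confirm that spark-core resolves and runs.
    val conf = new SparkConf().setAppName("untitled").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // A trivial MLlib call to confirm that spark-mllib is on the classpath.
    val v = Vectors.dense(1.0, 2.0, 3.0)
    println(Vectors.norm(v, 2.0))

    sc.stop()
  }
}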
However, importing the project produces the following errors:
SBT project import
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn]  * org.scala-lang:scala-compiler:(2.11.0, 2.11.7)
[warn]  * org.apache.commons:commons-lang3:(3.3.2, 3.0)
[warn]  * jline:jline:(0.9.94, 2.12.1)
[warn]  * org.scala-lang.modules:scala-parser-combinators_2.11:(1.0.1, 1.0.4)
[warn]  * org.scala-lang.modules:scala-xml_2.11:(1.0.1, 1.0.4)
[warn]  * org.slf4j:slf4j-api:(1.7.10, 1.7.2)
[warn] [FAILED] net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar(src): (0ms)
[warn] ==== local: tried
[warn]   C:\Users\Cezar\.ivy2\local\net.sourceforge.f2j\arpack_combined_all\0.1\srcs\arpack_combined_all-sources.jar
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-sources.jar
[warn] [FAILED] javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(doc): (0ms)
[warn] ==== local: tried
[warn]   C:\Users\Cezar\.ivy2\local\javax.xml.bind\jsr173_api\1.0\docs\jsr173_api-javadoc.jar
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/javax/xml/bind/jsr173_api/1.0/jsr173_api-1.0-javadoc.jar
[warn] [FAILED] javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(src): (0ms)
[warn] ==== local: tried
[warn]   C:\Users\Cezar\.ivy2\local\javax.xml.bind\jsr173_api\1.0\srcs\jsr173_api-sources.jar
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/javax/xml/bind/jsr173_api/1.0/jsr173_api-1.0-sources.jar
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::              FAILED DOWNLOADS            ::
[warn] :: ^ see resolution messages for details  ^ ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar(src)
[warn] :: javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(doc)
[warn] :: javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(src)
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
Answer 0 (score: 1)
Spark does not work with Scala 2.11 out of the box. It is built against Scala 2.10, so you need to use a compatible Scala version (see http://spark.apache.org/docs/latest/).
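Concretely, assuming the rest of build.sbt stays as posted, the fix would be to switch scalaVersion to a 2.10.x release; 2.10.5 below is only an example of a compatible version, not a required one:

import _root_.sbt.Keys._

name := "untitled"

version := "1.0"

// Any recent 2.10.x release should match the pre-built Spark 1.5.1 artifacts.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.1",
  "org.apache.spark" %% "spark-sql" % "1.5.1",
  "org.apache.spark" %% "spark-mllib" % "1.5.1")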
Alternatively, you can build Spark yourself for Scala 2.11, as @eliasah mentioned in the comments. Instructions on how to build Spark are available at http://spark.apache.org/docs/latest/building-spark.html