I get the following error when running my Spark hello world program.
[info] Updating {file:/C:/Users/user1/IdeaProjects/sqlServer/}sqlserver...
[info] Resolving org.apache.spark#spark-core_2.12;2.1.1 ...
[warn] module not found: org.apache.spark#spark-core_2.12;2.1.1
[warn] ==== local: tried
[warn]   C:\Users\user1\.ivy2\local\org.apache.spark\spark-core_2.12\2.1.1\ivys\ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.12/2.1.1/spark-core_2.12-2.1.1.pom
[warn] ==== local-preloaded-ivy: tried
[warn]   C:\Users\user1\.sbt\preloaded\org.apache.spark\spark-core_2.12\2.1.1\ivys\ivy.xml
[warn] ==== local-preloaded: tried
[warn]   file:/C:/Users/user1/.sbt/preloaded/org/apache/spark/spark-core_2.12/2.1.1/spark-core_2.12-2.1.1.pom
[info] Resolving jline#jline;2.14.3 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.12;2.1.1: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn]   org.apache.spark:spark-core_2.12:2.1.1 (C:\Users\user1\IdeaProjects\sqlServer\build.sbt#L7-8)
[warn]     +- mpa:mpa_2.12:1.0
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.1.1: not found
[error] Total time: 1 s, completed May 9, 2017 11:05:44 AM
Here is my build.sbt:
name := "Mpa"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
And my Spark welcome message:
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.
Update

I changed my build.sbt to:
name := "Mpa"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core_2.11" % "2.1.0"
But I still get:
[info] Updating {file:/C:/Users/user1/IdeaProjects/sqlServer/}sqlserver...
[info] Resolving org.apache.spark#spark-core_2.11_2.11;2.1.0 ...
[warn] module not found: org.apache.spark#spark-core_2.11_2.11;2.1.0
[warn] ==== local: tried
[warn]   C:\Users\user1\.ivy2\local\org.apache.spark\spark-core_2.11_2.11\2.1.0\ivys\ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11_2.11/2.1.0/spark-core_2.11_2.11-2.1.0.pom
[warn] ==== local-preloaded-ivy: tried
[warn]   C:\Users\user1\.sbt\preloaded\org.apache.spark\spark-core_2.11_2.11\2.1.0\ivys\ivy.xml
[warn] ==== local-preloaded: tried
[warn]   file:/C:/Users/user1/.sbt/preloaded/org/apache/spark/spark-core_2.11_2.11/2.1.0/spark-core_2.11_2.11-2.1.0.pom
[info] Resolving jline#jline;2.12.1 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.11_2.11;2.1.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn]   org.apache.spark:spark-core_2.11_2.11:2.1.0 (C:\Users\user1\IdeaProjects\sqlServer\build.sbt#L7-8)
[warn]     +- mpa:mpa_2.11:1.0
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11_2.11;2.1.0: not found
[error] Total time: 1 s, completed May 9, 2017 1:01:01 PM
Answer 0 (score: 12)
You have an error in your build.sbt file: you must change %% to %:
name := "Mpa"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core" % "2.1.1"
%% asks sbt to append the current Scala version to the artifact name, so with your explicit suffix it looked for spark-core_2.11_2.11. You can use spark-core_2.11 with % to solve the issue:
// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
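For what it's worth, with %% the unsuffixed name resolves the exact same artifact — a minimal sketch of the two equivalent forms, assuming scalaVersion := "2.11.8":

// Both lines resolve org.apache.spark:spark-core_2.11:2.1.0 when scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %  "spark-core_2.11" % "2.1.0" // suffix written by hand
libraryDependencies += "org.apache.spark" %% "spark-core"      % "2.1.0" // sbt appends _2.11 itself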
Hope this helps!
Answer 1 (score: 5)
I got the same error.
My build.sbt:
name := "Simple Project"
version := "1.0"
scalaVersion := "2.12.3"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
Change scalaVersion to 2.11.8 or lower and it works — Spark 2.2.0 is published for Scala 2.11, not 2.12.
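A sketch of the fixed build.sbt, assuming only the scalaVersion line changes:

name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.8"  // Spark 2.2.0 artifacts exist for Scala 2.11, not 2.12
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"  // now resolves spark-sql_2.11:2.2.0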
Answer 2 (score: 1)
I got the same error and solved it with the steps below. Basically, the jar file name did not match the sbt configuration:
- Check the file name of the spark-core jar in $SPARK_HOME/jars (it is spark-core_2.11-2.1.1.jar).
- Install Scala 2.11.11.
- Edit build.sbt to scalaVersion := "2.11.11" (full sketch below).
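Putting those steps together — a minimal build.sbt sketch that matches the shipped spark-core_2.11-2.1.1.jar, with the project name and version assumed from the question:

name := "Mpa"  // assumed from the question
version := "1.0"
scalaVersion := "2.11.11"  // matches the _2.11 suffix of the jar in $SPARK_HOME/jars
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"  // resolves spark-core_2.11:2.1.1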
Answer 3 (score: 0)
This worked for me. Sample build.sbt:
name := "testproj"
version := "0.1"
scalaVersion := "2.11.9"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
Answer 4 (score: 0)
SparkSession is available in the spark-sql library, so you have to add the spark-sql dependency to your build:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"
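Once that dependency resolves, a minimal sketch of a hello-world main using SparkSession (the object and app names here are illustrative):

import org.apache.spark.sql.SparkSession

object Hello {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark inside this JVM, which is enough for a hello-world check
    val spark = SparkSession.builder()
      .appName("hello-spark")
      .master("local[*]")
      .getOrCreate()

    spark.range(5).show()  // print a tiny DataFrame to confirm the setup works
    spark.stop()
  }
}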
Answer 5 (score: 0)
A version pairing that works for 2.11.12:
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.3.2",
"org.apache.spark" %% "spark-sql" % "2.3.2"
)
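Note that %% appends the Scala binary version rather than the full patch version, so with scalaVersion := "2.11.12" the lines above resolve the _2.11 artifacts — the equivalent explicit form would be:

// What %% expands to here: binary version 2.11, not 2.11.12
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.3.2",
  "org.apache.spark" % "spark-sql_2.11"  % "2.3.2"
)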