Cannot create a Spark SQLContext in a Scala application

Time: 2016-08-17 14:06:07

Tags: scala apache-spark sbt

I cannot create a SQLContext. My code:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

val sc = new SparkContext("local[*]", "myApp")
val sqlContext = new SQLContext(sc)

Error stack:

XXXXX

My sbt build file:

import AssemblyKeys._

assemblySettings

name := "Ideas"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "1.9.1" % "test",
  "junit" % "junit" % "4.12" % "test",
  "com.novocode" % "junit-interface" % "0.11" % "test->default",
  "org.mockito" % "mockito-core" % "1.9.5",
  "org.apache.spark" %% "spark-core" % "1.4.1",
  "org.apache.spark" %% "spark-sql" % "1.4.1",
  "org.apache.spark" %% "spark-hive" % "1.4.1",
  "org.apache.spark" % "spark-catalyst_2.10" % "1.4.1"
)

libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"

resolvers += Resolver.mavenLocal

testOptions += Tests.Argument(TestFrameworks.JUnit, "-q", "-v")

EclipseKeys.withSource := true

parallelExecution in Test := false

enablePlugins(JavaAppPackaging)

1 Answer:

Answer 0 (score: 0)

I solved this problem by checking the environment of our CDH Spark cluster. The cause was that our engineers had the developers upload a spark-sql.jar into the CDH Spark lib jars folder [/app/opt/cloudera/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/lib/spark/appjars/spark-sql_2.10-1.6.0.jar]. This was unnecessary, because the environment already contains that file, so moving (mv) it away or deleting it resolves the issue.
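The underlying issue is a classpath conflict: the application is built against Spark 1.4.1, while an extra spark-sql_2.10-1.6.0.jar had been copied into the cluster's Spark lib directory. As a general sketch (not part of the fix described above, and assuming the cluster is meant to supply Spark at run time), the Spark coordinates from the question's build file can be scoped "provided" so that sbt-assembly does not bundle copies that might clash with the jars the cluster already ships:

// Sketch only: the same Spark artifacts as in the question's build file,
// scoped "provided" so they are available at compile time but are not
// packaged into the assembly jar.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-hive" % "1.4.1" % "provided"
)

With "provided" scope the dependencies stay on the compile classpath, but the packaged artifact relies on whichever Spark version the cluster provides, which avoids shipping a second, conflicting copy of spark-sql.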