The following code produces:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.implicits()Lorg/apache/spark/sql/SQLContext$implicits$
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.log4j.Logger
import org.apache.log4j.Level

object Small {
  def main(args: Array[String]) {
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
    Logger.getLogger("org.eclipse.jetty.server").setLevel(Level.OFF)

    // set up environment
    val conf = new SparkConf()
      .setMaster("local[1]")
      .setAppName("Small")
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._

    val df = sc.parallelize(Array((1,30),(2,10),(3,20),(1,10),(2,30))).toDF("books","readers")
    df.show
  }
}
The project is built with SBT:
name := "Small"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.1"
I run it with this submit script:
#!/bin/sh
/home/test/usr/spark-1.1.0/bin/spark-submit \
--class Small \
--master local[*] \
--driver-memory 2g \
/home/test/wks/Pairs/target/scala-2.10/small_2.10-1.0.jar
Any ideas?
SBT compiles and packages this code. However, when I try to run it with sbt run, I get a different exception:

[error] (run-main-0) scala.reflect.internal.MissingRequirementError: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with java.net.URLClassLoader@210ce673 of type class java.net.URLClassLoader with classpath [file:/home/test/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.10.4.jar, ...

Is there a way to make sbt run include all dependencies when launching the Scala program?
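(As an aside that the answer below does not cover: this MissingRequirementError under sbt run is typically a classloader problem rather than a missing dependency. A commonly suggested workaround, sketched here as an assumption and not verified against this project, is to fork a separate JVM for the run task in build.sbt:

// fork a fresh JVM for "run" so Scala reflection sees a normal classpath
// instead of sbt's layered classloaders
fork in run := true
)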
Answer (score: 2)

Your submit script calls "/home/test/usr/spark-1.1.0/bin/spark-submit", so the runtime is Spark 1.1.0, while you compile against 1.3.1; the compile-time and runtime versions do not match. SQLContext in version 1.1.0 does not define an "object implicits".