I'm new to Spark programming in Scala. I'm trying to run the following program:
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object TestMain {
  def main(args: Array[String]) {
    val logFile = "spark-1.4.1/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
      .setMaster("spark://myhost:7077")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
My build.sbt is as follows:
name := "TestSpark"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1" % "provided"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0" % "provided"
I've tried everything suggested on various forums, but I still can't get rid of this error:
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
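From what I can tell, the usual explanation for this exception is that hadoop-client and spark-core each pull in their own copy of the javax.servlet classes (at least one of them from a signed jar), and the usual suggestion is to exclude the conflicting artifacts in build.sbt. This is a sketch of the kind of exclusion I tried; the specific organizations excluded are my guess at the culprits, not a known-good recipe:

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0" % "provided" excludeAll(
  // drop the servlet/jetty jars whose signer info clashes with Spark's copies
  ExclusionRule(organization = "javax.servlet"),
  ExclusionRule(organization = "org.mortbay.jetty")
)

Even with exclusions like these, I still get the same SecurityException.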
Any suggestions? Is there anything I'm doing wrong?