New to Spark and Scala. I'm trying to run a very simple Spark program through IntelliJ IDEA. All it does is connect to MongoDB, read the first document, and print it.
It was working fine, but now it throws this error:
org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.lang.Class.
Here is my code:
import org.apache.spark.{SparkConf, SparkContext}
import com.mongodb.spark._
import com.mongodb.spark.rdd.MongoRDD
import org.bson.Document
import com.mongodb.spark.config._
import org.apache.spark.sql.SQLContext
import com.mongodb.spark.sql._
import scala.reflect.runtime.universe._

object Analytics1 {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Analytics1").setMaster("local")
      .set("spark.mongodb.input.uri", "mongodb://192.168.56.1:27017/events.entEvent")
      .set("spark.mongodb.output.uri", "mongodb://192.168.56.1:27017/events..entResult")
    val sc = new SparkContext(conf)
    val rdd = sc.loadFromMongoDB()
    println(rdd.first())
    sc.stop()
  }
}
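As an aside, and only to illustrate what the snippet does (this is an assumption about the 1.1.0 connector API on my part, not something taken from the code above): as far as I know the implicit sc.loadFromMongoDB() call is equivalent to the explicit MongoSpark helper, which reads from the spark.mongodb.input.uri configured on the SparkConf.

    // Hedged sketch, assuming com.mongodb.spark.MongoSpark from connector 1.1.0:
    // load(sc) reads from the configured spark.mongodb.input.uri
    val rdd: MongoRDD[Document] = MongoSpark.load(sc)
    println(rdd.first())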
Here is my .sbt. If I use the latest version of Spark, it throws this error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame
so I'm using 1.6.1, which was working fine until a few days ago but is now throwing the java.lang.Class error above. Could someone please help me get unstuck? Since this is very basic, I'm hoping someone will spot the problem.
Thanks.
name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.7"
// libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
//libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "1.6.1"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.6.1"
//libraryDependencies += "org.mongodb.spark" % "mongo-spark-connector_2.10" % "1.1.0"
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "1.1.0"
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "1.2.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.1"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
resolvers += "snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/ "
resolvers += "releases" at "https://oss.sonatype.org/content/repositories/releases/"
Answer (score: 1)
libraryDependencies += "org.mongodb.spark" % "mongo-spark-connector_2.10" % "1.1.0"
You are loading the MongoDB Connector for Spark built for Scala 2.10, while your project uses Scala 2.11.7 (as does mongo-scala-driver).
Swap the line above for:
libraryDependencies += "org.mongodb.spark" % "mongo-spark-connector_2.11" % "1.1.0"
Alternatively, use a double % (%%), which makes sbt append your project's Scala version to the artifact name automatically:
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "1.1.0"