MongoDB Spark connector problem

Date: 2017-12-26 13:31:06

Tags: mongodb apache-spark apache-spark-sql

I am new to MongoDB. I am trying to pull data from MongoDB as a Spark DataFrame.

I am using the MongoDB Connector for Spark
Link: https://docs.mongodb.com/spark-connector/master/

I am following the steps on this page: https://docs.mongodb.com/spark-connector/master/scala/datasets-and-sql/
The program compiles successfully, but it fails at runtime with the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/ConnectionString
at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
at scala.util.Try$.apply(Try.scala:192)
at com.mongodb.spark.config.MongoCompanionConfig$class.connectionString(MongoCompanionConfig.scala:278)
at com.mongodb.spark.config.ReadConfig$.connectionString(ReadConfig.scala:39)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:51)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:124)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:113)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:67)
at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:50)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:307)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
at ScalaDemo.HelloWorld$.main(HelloWorld.scala:25)
at ScalaDemo.HelloWorld.main(HelloWorld.scala)
Caused by: java.lang.ClassNotFoundException: com.mongodb.ConnectionString
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 18 more

Below is the Maven snippet:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.mongodb.spark</groupId>
        <artifactId>mongo-spark-connector_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
</dependencies>

Code:

package ScalaDemo

import com.mongodb.spark._
import com.mongodb.spark.config._

object HelloWorld {
  def main(args: Array[String]): Unit = {
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local")
      .appName("MongoSparkConnectorIntro")
      .config("spark.mongodb.input.uri", "mongodb://localhost/admin.partnerCompanies")
      .config("spark.mongodb.output.uri", "mongodb://localhost/admin.partnerCompanies")
      .getOrCreate()

    val df1 = spark.read.format("com.mongodb.spark.sql").load()
    df1.show()
  }
}

Please help.

1 answer:

Answer 0: (score: 0)

This does not look Spark-related; your exception is

Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/ConnectionString

which means it cannot find the class used to connect to Mongo. Try adding the mongo-java-driver uber JAR:

<dependencies>
    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongo-java-driver</artifactId>
        <version>3.0.4</version>
    </dependency>
</dependencies>
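Since the error is a `NoClassDefFoundError` at runtime, it can also help to confirm whether the driver class from the stack trace is actually visible on the runtime classpath (for example, when the application is launched with a classpath that omits Maven dependencies). The following is a minimal sketch, not part of the original answer; the object name `DriverCheck` is made up for illustration:

```scala
// Minimal classpath check (a sketch): attempts to load the exact class
// reported in the stack trace and reports whether it is present.
object DriverCheck {
  def main(args: Array[String]): Unit = {
    try {
      // Class.forName throws ClassNotFoundException if the class is absent
      Class.forName("com.mongodb.ConnectionString")
      println("com.mongodb.ConnectionString found on the classpath")
    } catch {
      case _: ClassNotFoundException =>
        println("com.mongodb.ConnectionString is missing: " +
          "add mongo-java-driver to the runtime classpath")
    }
  }
}
```

If this prints the "missing" message even though the POM declares the dependency, the problem is how the program is launched rather than the POM itself.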