MongoDB Spark Connector: mongo-spark cannot connect to the DB

Date: 2017-04-20 15:24:26

Tags: java mongodb apache-spark

I am trying to read data from MongoDB using the MongoDB Spark Connector. I pass the database and collection details to the Spark conf object when starting the application, then read with the following code.

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;
import org.bson.Document;

import com.mongodb.spark.MongoSpark;
import com.mongodb.spark.rdd.api.java.JavaMongoRDD;

// create configuration
SparkSession spark = SparkSession.builder()
  .master("local")
  .appName("MongoSparkConnectorIntro")
  .config("spark.mongodb.input.uri", "mongodb://localhost:27017/Employee.zipcodes")
  .config("spark.mongodb.output.uri", "mongodb://localhost:27017/Employee.test")
  .getOrCreate();

// Create a JavaSparkContext using the SparkSession's SparkContext object
JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

/* Start Example: Read data from MongoDB ***********************/

JavaMongoRDD<Document> rdd = MongoSpark.load(jsc);

/* End Example *************************************************/

// Analyze data from MongoDB
System.out.println(rdd.count());
System.out.println(rdd.first().toJson());

However, this fails to connect to the localhost database and throws the following error.

Exception in thread "main" java.lang.NoSuchMethodError: com.mongodb.spark.config.ReadConfig$.apply(Lorg/apache/spark/SparkConf;Lscala/collection/Map;)Ljava/lang/Object;
 at com.mongodb.spark.MongoSpark$Builder.build(MongoSpark.scala:259)
 at com.mongodb.spark.MongoSpark$.load(MongoSpark.scala:375)
 at com.mongodb.spark.MongoSpark.load(MongoSpark.scala)
 at com.mycompany.app.App2.main(App2.java:35)

I am using the following Maven dependencies.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
</dependency>
<dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>1.1.0</version>
</dependency>
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>bson</artifactId>
    <version>3.2.2</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.7</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
</dependency>

1 Answer:

Answer 0 (score: 1)

The NoSuchMethodError points to a version mismatch rather than a connectivity problem: mongo-spark-connector 1.1.0 was built against the Spark 1.x API, so its ReadConfig signatures do not exist in Spark 2.1.0 at runtime. Upgrade the connector to a 2.x release that matches your Spark version:

<dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>2.2.1</version>
</dependency>