Unable to read from a MongoDB secondary node via Spark in Scala

Asked: 2017-11-10 01:14:13

Tags: mongodb apache-spark

I wrote some Spark code in Scala that reads data from MongoDB. A sample of the code is below:

import org.apache.hadoop.conf.Configuration
import org.apache.spark.{SparkConf, SparkContext}
import org.bson.BSONObject
import com.mongodb.hadoop.MongoInputFormat

// Point the mongo-hadoop connector at the secondary and ask for secondary reads
val mongoConfig = new Configuration()
mongoConfig.set("mongo.input.uri",
  "mongodb://secondarydb.test.local/testdb.test?readPreference=secondary")

val sparkConf = new SparkConf().setMaster("local[5]").setAppName("MongoSecondaryRead")
val sc = new SparkContext(sparkConf)

// Load the collection as an RDD of (Object, BSONObject) pairs
val documents = sc.newAPIHadoopRDD(
  mongoConfig,
  classOf[MongoInputFormat],
  classOf[Object],
  classOf[BSONObject])

I did add readPreference=secondary, but I still get the following exception:

Exception in thread "main" com.mongodb.MongoNotPrimaryException:
 The server is not the primary and did not execute the operation
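
One commonly suggested cause of this error (not confirmed by the question) is that the URI names a single host, so the driver connects to that node directly rather than discovering the replica-set topology, and operations that must run against the primary fail with `MongoNotPrimaryException`. A minimal sketch of a replica-set-style URI, assuming hypothetical host names (`primarydb.test.local`, `secondarydb.test.local`) and a hypothetical replica set name `rs0`:

```scala
// Hypothetical: list the replica-set members and the replica set name so the
// driver discovers the topology and can route reads to a secondary, instead
// of connecting to one node directly.
mongoConfig.set(
  "mongo.input.uri",
  "mongodb://primarydb.test.local:27017,secondarydb.test.local:27017" +
    "/testdb.test?replicaSet=rs0&readPreference=secondary")
```

This is a configuration sketch, not a verified fix; the actual member addresses and replica set name would have to match the deployment in the question.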

0 Answers:

There are no answers yet.