Java mongo-spark-connector - how to do unordered bulk writes to MongoDB?

Asked: 2018-09-30 18:42:09

Tags: java mongodb apache-spark

Can someone help me write documents to MongoDB from Java as unordered bulk inserts, using the mongo-spark-connector?

Here is my Spark & Mongo configuration:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.3.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.3.1</version>
    </dependency>
    <dependency>
        <groupId>org.mongodb.spark</groupId>
        <artifactId>mongo-spark-connector_2.11</artifactId>
        <version>2.3.0</version>
    </dependency>

I tried writing the documents in bulk, but every insert command still goes out with ordered: true.

    import java.util.HashMap;
    import java.util.Map;

    import com.mongodb.spark.MongoSpark;
    import com.mongodb.spark.config.WriteConfig;

    Map<String, String> writeOverrides = new HashMap<String, String>();
    writeOverrides.put("collection", collection);
    writeOverrides.put("maxBatchSize", "1000");  // option values must be Strings
    writeOverrides.put("writeConcern.w", "1");
    writeOverrides.put("forceInsert", "true");
    writeOverrides.put("ordered", "false");      // request unordered inserts

    WriteConfig wc = WriteConfig.create(SparkMongoSession.sparkContext).withOptions(writeOverrides);
    MongoSpark.save(dataSet, wc);
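
In case the per-write overrides are being ignored, the same options can also be tried globally when the session is created. This is only a minimal sketch: the spark.mongodb.output.ordered key simply mirrors the ordered option above and is an assumption, as are the URI and the app name.

    import org.apache.spark.sql.SparkSession;

    // Sketch: set the connector's write options once, at session level.
    // "spark.mongodb.output.ordered" mirrors the "ordered" WriteConfig key
    // used above; whether this connector version honors it is an assumption.
    SparkSession spark = SparkSession.builder()
            .appName("unordered-bulk-write")
            .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/MyDB.MyCollection")
            .config("spark.mongodb.output.maxBatchSize", "1000")
            .config("spark.mongodb.output.ordered", "false")
            .getOrCreate();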

Mongo log:

[conn173] command MyDB.MyCollection command: insert { insert: "MyCollection", **ordered: true**, writeConcern: { w: 1 }, $db: "MyDB" } **ninserted:1000** keysInserted:1000 ....
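
For reference, one way to force unordered batches regardless of what the connector does is to drop down to the MongoDB Java driver inside foreachPartition. This is only a sketch, not the connector's own write path: the mongodb:// URI, the MyDB.MyCollection namespace (taken from the log above), and the batch size of 1000 (mirroring maxBatchSize) are all placeholders, and dataSet is assumed to be a Dataset<Document>.

    import java.util.ArrayList;
    import java.util.List;

    import org.apache.spark.api.java.function.ForeachPartitionFunction;
    import org.bson.Document;

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.model.BulkWriteOptions;
    import com.mongodb.client.model.InsertOneModel;

    // Sketch: write each partition with an explicitly unordered bulk insert.
    // The cast disambiguates between Spark's Java and Scala overloads.
    dataSet.foreachPartition((ForeachPartitionFunction<Document>) partition -> {
        try (MongoClient client = MongoClients.create("mongodb://127.0.0.1")) {
            MongoCollection<Document> coll =
                    client.getDatabase("MyDB").getCollection("MyCollection");
            List<InsertOneModel<Document>> batch = new ArrayList<>();
            while (partition.hasNext()) {
                batch.add(new InsertOneModel<>(partition.next()));
                if (batch.size() == 1000) {  // flush in driver-side batches
                    coll.bulkWrite(batch, new BulkWriteOptions().ordered(false));
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                coll.bulkWrite(batch, new BulkWriteOptions().ordered(false));
            }
        }
    });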

0 Answers:

No answers yet.