I tried to build a custom import command for large JSON files, based on this project: https://github.com/codediodeio/firestore-migrator
But my custom command fails with the following error:
(node:19413) UnhandledPromiseRejectionWarning: Error: Cannot modify a WriteBatch that has been committed.
at WriteBatch.verifyNotCommitted (/Users/mac-clement/Documents/projets/dpas/gcp/import-data/csv-import/node_modules/@google-cloud/firestore/build/src/write-batch.js:116:19)
at WriteBatch.set (/Users/mac-clement/Documents/projets/dpas/gcp/import-data/csv-import/node_modules/@google-cloud/firestore/build/src/write-batch.js:234:14)
at Object.<anonymous> (/Users/mac-clement/Documents/projets/dpas/gcp/import-data/csv-import/dist/src/importJson.js:90:17)
at Generator.next (<anonymous>)
at /Users/mac-clement/Documents/projets/dpas/gcp/import-data/csv-import/dist/src/importJson.js:7:71
at new Promise (<anonymous>)
at __awaiter (/Users/mac-clement/Documents/projets/dpas/gcp/import-data/csv-import/dist/src/importJson.js:3:12)
at batchSet (/Users/mac-clement/Documents/projets/dpas/gcp/import-data/csv-import/dist/src/importJson.js:85:33)
at Object.<anonymous> (/Users/mac-clement/Documents/projets/dpas/gcp/import-data/csv-import/dist/src/importJson.js:74:19)
at Generator.next (<anonymous>)
(node:19413) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 174)
It may be a promise issue, since a new batch is created inside the batchCommit function... but I'm struggling to pin it down! Thanks for your help!
/**
* Dependencies
*/
import * as admin from "firebase-admin";
import * as fs from "file-system";
import * as _ from "lodash";
import {streamArray} from "stream-json/streamers/StreamArray";
import {parser} from "stream-json";
/**
* Global variables
*/
let args;
let db = admin.firestore();
let batch = db.batch();
let batchCount = 0;
let totalSetCount = 0;
/**
* Main function
*
* @param file
* @param collection
* @param options
*/
export const execute = (file: string, collection: string, options) => {
  args = options;
  if (args.dryRun) args.verbose = true;

  console.log('Importing data...');
  console.log('File path: ' + file);
  console.log('Collection: ' + collection);
  console.log('Limit: ' + args.limit);
  console.log('Chunk: ' + args.chunk);

  return fs.createReadStream(file)
    .pipe(parser())
    .pipe(streamArray())
    .on('data', async (row) => {
      await Promise.resolve(manageRow(row.value, collection));
    })
    .on('end', async () => {
      // Final batch commit and completion message.
      await batchCommit(false);
      console.log(args.dryRun
        ? 'Dry-Run complete, Firestore was not updated.'
        : 'Import success, Firestore updated!'
      );
      console.log(`Total documents written: ${totalSetCount}`);
    });
}
/**
*
* @param row
* @param collection
*/
const manageRow = async (row: object, collection: string) => {
  const colRef = db.collection(collection);
  return new Promise(async (resolve, reject) => {
    for (let [id, item] of Object.entries(row)) {
      const docRef = colRef.doc(id);
      await batchSet(docRef, item);
    }
    resolve();
  });
}
/**
* Add an item in the batch and call commit if batch size reached chunk
*
* @param ref
* @param item
*/
const batchSet = async (ref: FirebaseFirestore.DocumentReference, item: object) => {
  // Log if requested
  args.verbose && console.log(`Writing: ${ref.path}`);
  // Set the document data
  ++totalSetCount;
  await batch.set(ref, item);
  // Commit batch on chunk size
  if (++batchCount % args.chunk === 0) {
    await batchCommit();
  }
}
/**
* Commit changes to FireStore database and initialize a new batch if recycle is set to true
*
* @param recycle
*/
const batchCommit = async (recycle: boolean = true) => {
  // Nothing to commit, or dry run, so do not commit
  if (!batchCount || args.dryRun) return;
  // Log if requested
  args.verbose && console.log(batchCount + ' documents have been written so far...');
  await batch.commit();
  if (recycle) {
    batch = db.batch();
    batchCount = 0;
  }
}
Answer (score: 1)
It looks like you are reusing a batch object across multiple commits. That is not valid: once a WriteBatch has been committed, it can no longer be modified. Create a new batch object for each commit.
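In your code the recycling happens after `await batch.commit()`, and stream-json's 'data' events do not wait for your async handler, so another batchSet can still reach the old batch while the commit is in flight. A minimal sketch of a safer ordering, using hypothetical MockBatch and ChunkedWriter classes to stand in for the Firestore API (this is an illustration of the pattern, not firebase-admin code): swap in a fresh batch synchronously before awaiting the commit, so no later set() can touch the committed object.

```typescript
// MockBatch is a hypothetical stand-in for Firestore's WriteBatch,
// reproducing only the rule this error enforces: no set() after commit().
class MockBatch {
  private committed = false;
  private writes = 0;
  set(_doc: string, _data: object): void {
    if (this.committed) {
      throw new Error("Cannot modify a WriteBatch that has been committed.");
    }
    this.writes++;
  }
  async commit(): Promise<number> {
    this.committed = true;
    return this.writes;
  }
}

// ChunkedWriter (hypothetical) shows the fix: recycle the batch
// *before* awaiting the commit, so concurrent set() calls always
// land on a batch that has not been committed yet.
class ChunkedWriter {
  private batch = new MockBatch();
  private count = 0;
  total = 0;
  constructor(private chunk: number) {}

  async set(doc: string, data: object): Promise<void> {
    this.batch.set(doc, data);
    this.total++;
    if (++this.count >= this.chunk) await this.flush();
  }

  async flush(): Promise<void> {
    if (!this.count) return;
    const full = this.batch;
    this.batch = new MockBatch(); // recycle first (synchronously)...
    this.count = 0;
    await full.commit();          // ...then await the commit
  }
}

(async () => {
  const writer = new ChunkedWriter(100);
  for (let i = 0; i < 250; i++) await writer.set(`doc-${i}`, { i });
  await writer.flush();
  console.log(writer.total); // prints 250, with no committed-batch error
})();
```

In the real importer the equivalent change would be inside batchCommit: assign `batch = db.batch()` and reset batchCount before awaiting the old batch's commit. Pausing the read stream while a commit is in flight (stream.pause()/stream.resume()) is another way to keep sets and commits from interleaving.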