I want to create a job that runs every day at 2 AM. The job must create a BigQuery table by reading my files from a Cloud Storage bucket. How can I achieve this?
Answer 0: (score: 2)
You can import Firestore exports directly into BigQuery. Set up a load job with its sourceFormat equal to DATASTORE_BACKUP
(yes, even for Firestore) and its writeDisposition set to WRITE_TRUNCATE.
You can wrap this in a Cloud Function, calling either the API directly or the client libraries. If you need a code sample, tell me your language and I'll be glad to help.
EDIT
You need to add these dependencies to your package.json:
"@google-cloud/bigquery": "^4.7.0",
"@google-cloud/storage": "^5.0.1",
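For reference, a minimal package.json sketch carrying those two dependencies (the package name is a placeholder; adjust it to your project):

```json
{
  "name": "load-ds-export",
  "dependencies": {
    "@google-cloud/bigquery": "^4.7.0",
    "@google-cloud/storage": "^5.0.1"
  }
}
```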
Then, here is the function with static values. You can build something more dynamic if you want (for example, by reading the function's parameters).
const {Storage} = require('@google-cloud/storage');
const {BigQuery} = require('@google-cloud/bigquery');

const bigquery = new BigQuery();
const storage = new Storage();

const bucketName = "my_bucket"; // to change
const fileExport = "path/to/my_export.export_metadata"; // to change
const datasetId = "data"; // to change
const tableId = "dsexport"; // to change

exports.loadDSExport = async (req, res) => {
  // Configure the load job. For the full list of options, see:
  // https://cloud.google.com/bigquery/docs/reference/rest/v2/Job#JobConfigurationLoad
  const metadata = {
    sourceFormat: 'DATASTORE_BACKUP',
    autodetect: true,
    location: 'EU', // Set your correct region
    writeDisposition: "WRITE_TRUNCATE",
  };

  // Load data from a Cloud Storage file into the table.
  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load(storage.bucket(bucketName).file(fileExport), metadata);
  // load() waits for the job to finish.
  // This can take time; increase the function timeout if needed.

  // Check the job's status for errors.
  const errors = job.status.errors;
  if (errors && errors.length > 0) {
    // Handle errors and return an error code here
    throw errors;
  }
  console.log(`Job ${job.id} completed.`);
  res.send(`Job ${job.id} completed.`);
};
Then deploy the function like this (in private mode):
gcloud beta functions deploy --runtime nodejs10 --trigger-http --entry-point loadDSExport --region europe-west1 loadDSExport
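The deployment above doesn't cover the "every day at 2 AM" part of the question. One way to handle it is a Cloud Scheduler job that calls the function's HTTP endpoint on a cron schedule. A sketch, assuming the project ID, job name, and invoker service account are placeholders you replace with your own (the service account needs permission to invoke the private function):

```shell
# "0 2 * * *" = every day at 02:00 in the given time zone (names below are placeholders)
gcloud scheduler jobs create http load-ds-export-daily \
  --schedule="0 2 * * *" \
  --time-zone="Europe/Paris" \
  --http-method=GET \
  --uri="https://europe-west1-MY_PROJECT.cloudfunctions.net/loadDSExport" \
  --oidc-service-account-email="scheduler-invoker@MY_PROJECT.iam.gserviceaccount.com"
```

To test the private function manually before wiring up the scheduler, you can call it with your own identity token: `curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" <function URL>`.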