I am trying to create a partitioned table in BigQuery while loading data with a PHP script, but I have not found the right solution.
<?php
require_once '/v/vv/vv/common.php';
require_once '/vv/vv/vv/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;
use Google\Cloud\Core\ExponentialBackoff;

/** Uncomment and populate these variables in your code */
$projectId = 'id';
$datasetId = 'ids';

$bigQuery = new BigQueryClient([
    'projectId' => $projectId,
]);
$dataset = $bigQuery->dataset($datasetId);
$table_id = 'test_table';
$table = $dataset->table($table_id);

// create the import job
$schema = [
    'fields' => [
        ['name' => 'date', 'type' => 'Date', 'mode' => 'required'],
        ['name' => 'name', 'type' => 'string'],
        ['name' => 'post_abbr', 'type' => 'string']
    ],
    'timePartitioning' => ['type' => 'DAY', 'filed' => 'date']
];
Answer 0 (score: 0)
You are mixing the schema definition with the timePartitioning configuration. Here is a code snippet:
use Google\Cloud\BigQuery\BigQueryClient;

$source = 'miData.csv';
$datasetId = 'miDataset';
$table_id = 'miTable';

$bigQuery = new BigQueryClient();
$dataset = $bigQuery->dataset($datasetId);
$table = $dataset->table($table_id);

$loadConfig = $table->load(fopen($source, 'r'));
$loadConfig->sourceFormat('CSV');

// create the import job
// the schema contains only the field list
$schema = [
    'fields' => [
        ['name' => 'miDate', 'type' => 'DATE', 'mode' => 'required'],
        ['name' => 'name', 'type' => 'STRING'],
        ['name' => 'post_abbr', 'type' => 'STRING']
    ]
];
$loadConfig->schema($schema);

// partitioning is configured separately on the load configuration
$timePartitioning = ['type' => 'DAY', 'field' => 'miDate'];
$loadConfig->timePartitioning($timePartitioning);

$loadConfig->createDisposition('CREATE_IF_NEEDED');
$loadConfig->writeDisposition('WRITE_TRUNCATE');
$job = $table->runJob($loadConfig);
For more information about the available options for load jobs, check the API documentation and the PHP loadJobConfiguration reference.
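If it helps, here is a minimal sketch of waiting for the load job to finish and surfacing any load errors. It assumes the $job returned by runJob() above and reuses the ExponentialBackoff class already imported in your script; the retry count of 10 is just an illustrative value.

use Google\Cloud\Core\ExponentialBackoff;

// poll the job until BigQuery reports it as complete
$backoff = new ExponentialBackoff(10);
$backoff->execute(function () use ($job) {
    $job->reload();
    if (!$job->isComplete()) {
        throw new Exception('Job has not yet completed', 500);
    }
});

// check whether the load job finished with an error
if (isset($job->info()['status']['errorResult'])) {
    $error = $job->info()['status']['errorResult']['message'];
    printf('Error running job: %s' . PHP_EOL, $error);
} else {
    print('Data imported successfully' . PHP_EOL);
}

After the job succeeds, calling $table->reload() and inspecting $table->info()['timePartitioning'] should confirm the DAY partitioning on miDate.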