Airflow BigQuery - Missing required parameter "projectId"

Asked: 2017-09-12 00:07:48

Tags: google-bigquery airflow apache-airflow

I am trying the following example:

https://cloud.google.com/blog/big-data/2017/07/how-to-aggregate-data-for-bigquery-using-apache-airflow

When running one of its commands:

airflow test bigquery_github_trends_v1 bq_check_githubarchive_day 2017-06-02

I get the error: TypeError: Missing required parameter "projectId"

Error stack trace:

[2017-09-11 16:32:26,630] {models.py:1126} INFO - Dependencies all met for <TaskInstance: bigquery_github_trends_v1.bq_check_githubarchive_day 2017-06-02 00:00:00 [None]>
[2017-09-11 16:32:26,631] {models.py:1126} INFO - Dependencies all met for <TaskInstance: bigquery_github_trends_v1.bq_check_githubarchive_day 2017-06-02 00:00:00 [None]>
[2017-09-11 16:32:26,632] {models.py:1318} INFO - 
--------------------------------------------------------------------------------
Starting attempt 1 of 6
--------------------------------------------------------------------------------

[2017-09-11 16:32:26,632] {models.py:1342} INFO - Executing <Task(BigQueryCheckOperator): bq_check_githubarchive_day> on 2017-06-02 00:00:00
[2017-09-11 16:32:26,643] {check_operator.py:75} INFO - Executing SQL check: 
#legacySql
SELECT table_id 
FROM [githubarchive:day.__TABLES__] 
WHERE table_id = "20170601"

[2017-09-11 16:32:26,646] {gcp_api_base_hook.py:73} INFO - Getting connection using `gcloud auth` user, since no key file is defined for hook.
[2017-09-11 16:32:26,671] {models.py:1417} ERROR - Missing required parameter "projectId"
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python2.7/site-packages/airflow/operators/check_operator.py", line 76, in execute
    records = self.get_db_hook().get_first(self.sql)
  File "/usr/local/lib/python2.7/site-packages/airflow/hooks/dbapi_hook.py", line 135, in get_first
    cur.execute(sql)
  File "/usr/local/lib/python2.7/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 752, in execute
    self.job_id = self.run_query(bql)
  File "/usr/local/lib/python2.7/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 244, in run_query
    return self.run_with_configuration(configuration)
  File "/usr/local/lib/python2.7/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 498, in run_with_configuration
    .insert(projectId=self.project_id, body=job_data) \
  File "/usr/local/lib/python2.7/site-packages/googleapiclient/discovery.py", line 716, in method
    raise TypeError('Missing required parameter "%s"' % name)
TypeError: Missing required parameter "projectId"

2 answers:

Answer 0 (score: 2)

If you check the code of bigquery_hook, you will see that it looks up a project_id: https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/hooks/bigquery_hook.py#L54

The default connection is bigquery_default, unless you override it. Go to the Airflow UI: Admin -> Connections -> bigquery_default (or whichever connection you created) -> add the project ID there.

(screenshot: the connection edit form with the project ID field)
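For reference, the GCP base hook reads the project from the connection's Extra field, which is a small JSON blob. A minimal sketch of that lookup (the key name is assumed here from Airflow's `extra__google_cloud_platform__` prefix convention; the helper is illustrative, not Airflow's own function):

```python
import json

def project_id_from_extras(extra_json):
    # The connection's Extra field holds JSON; the GCP project lives
    # under the prefixed key used by google_cloud_platform connections.
    extras = json.loads(extra_json or "{}")
    return extras.get("extra__google_cloud_platform__project")

print(project_id_from_extras(
    '{"extra__google_cloud_platform__project": "yourprojectid"}'))
# → yourprojectid

# With an empty Extra field the project resolves to None, which is
# exactly what triggers the "Missing required parameter" TypeError.
print(project_id_from_extras(""))  # → None
```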

Answer 1 (score: 0)

I know this is an old question, but I struggled with it too, because for me the UI did not work. I just figured out how to do this via the CLI and want to share my findings, since this is not documented.

There are actually three ways:

  1. Via an environment variable, as indicated here (note the quotes, since the URI contains `&`):
export AIRFLOW_CONN_BIGQUERY_DEFAULT='google_cloud_platform://:@:?extra__google_cloud_platform__project=yourprojectid&extra__google_cloud_platform__key_path=/path/to/keyfile.json'
  2. Via the CLI and a connection URI:
airflow connections -d --conn_id bigquery_default
airflow connections -a --conn_id bigquery_default --conn_uri 'google_cloud_platform://:@:?extra__google_cloud_platform__project=yourprojectid&extra__google_cloud_platform__key_path=/path/to/keyfile.json'
  3. Via the CLI and connection params:
airflow connections -d --conn_id bigquery_default
airflow connections -a --conn_id bigquery_default --conn_type google_cloud_platform --conn_extra '{"extra__google_cloud_platform__project":"yourprojectid", "extra__google_cloud_platform__key_path":"/path/to/keyfile.json"}'

If you leave out the key path part, Airflow will use whatever credentials the gcloud command line tool is currently using, which is usually your personal user.

Once this is done, you can run any task that uses the connection with airflow run ... or airflow test ...