Setting up Airflow with the BigQuery operator

Asked: 2016-08-27 07:31:08

Tags: google-bigquery airflow

I'm experimenting with Airflow for a data pipeline. Unfortunately, so far I have not been able to get it working with the BigQuery operator. I've searched for a solution, but I'm still stuck. I'm using the SequentialExecutor, running locally.

Here is my code:

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator
from datetime import datetime, timedelta

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['example@gmail.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

dag = DAG(dag_id='bigQueryPipeline', default_args=default_args, schedule_interval=timedelta(1))

t1 = BigQueryOperator(
    task_id='bigquery_test',
    bql='SELECT COUNT(userId) FROM [events:EVENTS_20160501]',
    destination_dataset_table=False,
    bigquery_conn_id='bigquery_default',
    delegate_to=False,
    udf_config=False,
    dag=dag,
)

The error message:

[2016-08-27 00:13:14,665] {models.py:1327} ERROR - 'project'
Traceback (most recent call last):
  File "/Users/jean.rodrigue/anaconda/bin/airflow", line 15, in <module>
    args.func(args)
  File "/Users/jean.rodrigue/anaconda/lib/python2.7/site-packages/airflow/bin/cli.py", line 352, in test
    ti.run(force=True, ignore_dependencies=True, test_mode=True)
  File "/Users/jean.rodrigue/anaconda/lib/python2.7/site-packages/airflow/utils/db.py", line 53, in wrapper
    result = func(*args, **kwargs)
  File "/Users/jean.rodrigue/anaconda/lib/python2.7/site-packages/airflow/models.py", line 1245, in run
    result = task_copy.execute(context=context)
  File "/Users/jean.rodrigue/anaconda/lib/python2.7/site-packages/airflow/contrib/operators/bigquery_operator.py", line 57, in execute
    conn = hook.get_conn()
  File "/Users/jean.rodrigue/anaconda/lib/python2.7/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 54, in get_conn
    project = connection_extras['project']

3 Answers:

Answer 0 (score: 9):

It took me a while to finally track this down, because it is not documented very clearly. In the Airflow UI, go to Admin -> Connections. The connection id there is what the bigquery_connection_id parameter refers to. In that connection's "extras" field, you must add a JSON object that defines the k,v pair "project": "<your project id>".

If you have not explicitly authorized an account on the box where Airflow is running (gcloud auth), you will also have to add keys for "service_account" and "key_path".
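For reference, a minimal sketch of what the "extras" JSON for the connection might look like; the key names come from this answer, but the project, service account, and path values below are placeholders, not from the original:

{
    "project": "my-gcp-project",
    "service_account": "my-sa@my-gcp-project.iam.gserviceaccount.com",
    "key_path": "/path/to/keyfile.json"
}

If the box is already authorized via gcloud auth, the "project" key alone should be enough.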

Answer 1 (score: 2):

If you need to do it programmatically, I use this as an entrypoint in our stack to create the connection if it doesn't already exist:

from airflow.models import Connection
from airflow.settings import Session

session = Session()
gcp_conn = Connection(
    conn_id='bigquery',
    conn_type='google_cloud_platform',
    extra='{"extra__google_cloud_platform__project":"<YOUR PROJECT HERE>"}')
# Only insert the connection if no row with this conn_id exists yet.
if not session.query(Connection).filter(
        Connection.conn_id == gcp_conn.conn_id).first():
    session.add(gcp_conn)
    session.commit()
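As a quick sanity check (a sketch, not part of the original answer), you can read the connection back through BaseHook and confirm the project key is visible, assuming the conn_id 'bigquery' from the snippet above:

from airflow.hooks.base_hook import BaseHook

# Fetch the stored connection and inspect its parsed extras.
conn = BaseHook.get_connection('bigquery')
print(conn.extra_dejson.get('extra__google_cloud_platform__project'))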

Answer 2 (score: 0):

Recently I fixed a similar issue by specifying both bigquery_conn_id and google_cloud_storage_conn_id, like this:

t1 = BigQueryOperator(
  task_id='bigquery_test',
  bql='SELECT COUNT(userId) FROM [events:EVENTS_20160501]',
  destination_dataset_table=False,
  bigquery_conn_id='bigquery_default',             <-- Need these both
  google_cloud_storage_conn_id='bigquery_default', <-- because of inheritance
  delegate_to=False,
  udf_config=False,
  dag=dag,
)

See more in this answer: https://stackoverflow.com/a/45664830/634627