New to Airflow here.
I'm running Airflow from the Docker image with the LocalExecutor, and one task pulls data from MySQL into Google Cloud Storage using the operator below. It is expected to extract roughly 1 million records.
from airflow.contrib.operators.mysql_to_gcs import MySqlToGoogleCloudStorageOperator

# Extract and load the customer table
extract_customer_table_from_mysql = MySqlToGoogleCloudStorageOperator(
task_id='InitialExtractCustomerToGCS',
mysql_conn_id='iprocure_staging_db_conn',
sql='SELECT * FROM iProcureMain.customer',
bucket=bucket,
filename='iprocure-bigquery-bucket/customer/{{ ts_nodash }}/iprocure-bigquery-bucket-customer.json',
schema_filename='iprocure-bigquery-bucket/customer_schema.json',
google_cloud_storage_conn_id='iprocure_gcs_conn',
dag=dag)
It fails after some time. Here is the log from the task's execution:
[2019-05-15 10:12:13,392] {models.py:1595} INFO - Executing <Task(MySqlToGoogleCloudStorageOperator): InitialExtractCustomerToGCS> on 2019-05-15T09:05:58.731452+00:00
[2019-05-15 10:12:13,393] {base_task_runner.py:118} INFO - Running: ['bash', '-c', 'airflow run MySQLtoBQInitalLoad InitialExtractCustomerToGCS 2019-05-15T09:05:58.731452+00:00 --job_id 19 --raw -sd DAGS_FOLDER/initial_load.py --cfg_path /tmp/tmp_f96d99z']
[2019-05-15 10:12:19,852] {base_task_runner.py:101} INFO - Job 19: Subtask InitialExtractCustomerToGCS [2019-05-15 10:12:19,849] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2019-05-15 10:12:22,540] {base_task_runner.py:101} INFO - Job 19: Subtask InitialExtractCustomerToGCS [2019-05-15 10:12:22,538] {__init__.py:51} INFO - Using executor LocalExecutor
[2019-05-15 10:12:27,379] {base_task_runner.py:101} INFO - Job 19: Subtask InitialExtractCustomerToGCS [2019-05-15 10:12:27,365] {models.py:271} INFO - Filling up the DagBag from /usr/local/airflow/dags/initial_load.py
[2019-05-15 10:12:28,528] {base_task_runner.py:101} INFO - Job 19: Subtask InitialExtractCustomerToGCS [2019-05-15 10:12:28,524] {cli.py:484} INFO - Running <TaskInstance: MySQLtoBQInitalLoad.InitialExtractCustomerToGCS 2019-05-15T09:05:58.731452+00:00 [running]> on host 3c7603479eef
[2019-05-15 10:12:28,728] {logging_mixin.py:95} INFO - [2019-05-15 10:12:28,718] {base_hook.py:83} INFO - Using connection to: datawarehousereplica.crgjkux43gqm.us-west-2.rds.amazonaws.com
[2019-05-15 10:32:12,569] {logging_mixin.py:95} INFO - [2019-05-15 10:32:12,493] {jobs.py:2627} INFO - Task exited with return code -9
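From what I can tell, return code -9 means the process was killed with SIGKILL, and since the task dies about 20 minutes into fetching ~1M rows, my guess is the container's OOM killer is terminating it (the operator appears to buffer the whole result set client-side before writing files). As a workaround I'm considering splitting the extract into fixed-size chunks with LIMIT/OFFSET, along the lines of the sketch below. The chunk size, task ids, and the ORDER BY column (customer_id) are placeholders of my own, not values from the real DAG:

from airflow.contrib.operators.mysql_to_gcs import MySqlToGoogleCloudStorageOperator

CHUNK_SIZE = 100000  # placeholder: ~1M rows -> 10 tasks
NUM_CHUNKS = 10

for i in range(NUM_CHUNKS):
    MySqlToGoogleCloudStorageOperator(
        task_id='ExtractCustomerToGCS_chunk_{}'.format(i),
        mysql_conn_id='iprocure_staging_db_conn',
        # ORDER BY a stable key so LIMIT/OFFSET pagination is deterministic;
        # 'customer_id' is an assumed column name.
        sql='SELECT * FROM iProcureMain.customer '
            'ORDER BY customer_id LIMIT {} OFFSET {}'.format(
                CHUNK_SIZE, i * CHUNK_SIZE),
        bucket=bucket,  # defined earlier in the DAG file
        filename='iprocure-bigquery-bucket/customer/{{ ts_nodash }}/'
                 'customer_chunk_' + str(i) + '.json',
        schema_filename='iprocure-bigquery-bucket/customer_schema.json',
        google_cloud_storage_conn_id='iprocure_gcs_conn',
        dag=dag)

Is chunking like this a sensible approach, or is there a way to make the operator stream the result set (or otherwise stay within memory) so the single-task extract succeeds?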