I am trying to run a Python script that logs in to an Amazon Redshift DB and then executes a SQL command. I use a tool called Airflow for workflow management. When running the code below, I am able to log in to the database fine, but when the SQL command is executed I get the following error:
**AttributeError: 'NoneType' object has no attribute 'execute'**
Code:
## Login to DB
def db_log(**kwargs):
    global db_con
    try:
        db_con = psycopg2.connect(
            " dbname = 'name' user = 'user' password = 'pass' host = 'host' port = '5439'")
    except:
        print("I am unable to connect")
        print('Connection Task Complete')
        task_instance = kwargs['task_instance']
        task_instance.xcom_push(key="dwh_connection", value="dwh_connection")
        return (dwh_connection)

def insert_data(**kwargs):
    task_instance = kwargs['task_instance']
    db_con_xcom = task_instance.xcom_pull(key="dwh_connection", task_ids='DWH_Connect')
    cur = db_con_xcom
    cur.execute("""insert into tbl_1 select limit 2 ;""")
Can anyone help me fix this? Thanks.
Full code:
## Third party Library Imports
import pandas as pd
import psycopg2
import airflow
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta
from sqlalchemy import create_engine
import io

# Following are defaults which can be overridden later on
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2018, 5, 29, 12),
    'email': ['airflow@airflow.com']
}

dag = DAG('sample1', default_args=default_args)

## Login to DB
def db_log(**kwargs):
    global db_con
    try:
        db_con = psycopg2.connect(
            " dbname = 'name' user = 'user' password = 'pass' host = 'host' port = '5439'")
    except:
        print("I am unable to connect")
        print('Connection Task Complete')
        task_instance = kwargs['task_instance']
        task_instance.xcom_push(key="dwh_connection", value="dwh_connection")
        return (dwh_connection)

t1 = PythonOperator(
    task_id='DWH_Connect',
    python_callable=data_warehouse_login, provide_context=True,
    dag=dag)

#######################

def insert_data(**kwargs):
    task_instance = kwargs['task_instance']
    db_con_xcom = task_instance.xcom_pull(key="dwh_connection", task_ids='DWH_Connect')
    cur = db_con_xcom
    cur.execute("""insert into tbl_1 select limit 2 """)

##########################################

t2 = PythonOperator(
    task_id='DWH_Connect1',
    python_callable=insert_data, provide_context=True, dag=dag)

t1 >> t2
Answer 0 (score: 0)
Are you sure you have posted the complete code? You call `data_warehouse_login` as the `python_callable` of your first task, but that function is never defined. Assuming it is meant to be `db_log`, and assuming the first task succeeds, you are not actually pushing anything for the second task to pull (since your `xcom_push` only fires on an error).
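For illustration only, here is a minimal sketch of the control-flow fix, reusing the placeholder credentials and the imports from the full code above; note that even with this change the connection object itself is a poor XCom value, as explained next:

def db_log(**kwargs):
    try:
        db_con = psycopg2.connect(
            "dbname='name' user='user' password='pass' host='host' port='5439'")
    except psycopg2.OperationalError:
        print("I am unable to connect")
        raise  # fail the task instead of continuing silently
    print('Connection Task Complete')
    # This now runs on success. Only serializable values can be pushed;
    # a live psycopg2 connection is not, hence the hook suggested below.
    kwargs['task_instance'].xcom_push(key="dwh_connection", value="connected")
    return db_con

t1 = PythonOperator(
    task_id='DWH_Connect',
    python_callable=db_log,  # must reference the function actually defined
    provide_context=True,
    dag=dag)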
Generally, passing connection objects around via XCom is not advisable: XCom values are serialized into the Airflow metadata database, and a live database connection cannot survive that round trip. As an alternative, you may want to look at the bundled PostgresHook, which should cover your use case and works equally well against Amazon Redshift (Redshift speaks the PostgreSQL wire protocol).
https://github.com/apache/incubator-airflow/blob/master/airflow/hooks/postgres_hook.py
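As a sketch of what the second task could look like with the hook: it assumes a Connection with the hypothetical ID 'redshift_dwh' has been configured in the Airflow UI (Admin > Connections), and `tbl_source` is a made-up source table, since the question's SQL statement appears truncated:

from airflow.hooks.postgres_hook import PostgresHook

def insert_data(**kwargs):
    # The hook reads host and credentials from the 'redshift_dwh'
    # Connection entry, so no connection object has to travel
    # between tasks via XCom.
    hook = PostgresHook(postgres_conn_id='redshift_dwh')
    # run() opens a connection, executes the statement, and commits.
    hook.run("insert into tbl_1 select * from tbl_source limit 2;")

t2 = PythonOperator(
    task_id='DWH_Insert',
    python_callable=insert_data,
    provide_context=True,
    dag=dag)

With this approach the credentials also move out of the DAG file and into Airflow's connection store.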