Displaying a DataFrame with PySpark in AWS Glue

Date: 2019-12-24 16:52:29

Tags: python-3.x apache-spark pyspark aws-glue

How can I display a DataFrame in an AWS Glue ETL job?

I tried the code below, but nothing was displayed.

df.show()

Code

from pyspark.sql import Row
from pyspark.sql.types import StructType, StructField, StringType

# Read the source table from the Glue Data Catalog and rename/retype fields
datasource0 = glueContext.create_dynamic_frame.from_catalog(database = "flux-test", table_name = "tab1", transformation_ctx = "datasource0")
sourcedf = ApplyMapping.apply(frame = datasource0, mappings = [("id", "long", "id", "long"), ("Rd.Id_Releve", "string", "Rd.Id_R", "string")])
sourcedf = sourcedf.toDF()

# Target schema: a single nested struct column PM(Pf, Rd)
data = []
schema = StructType([
    StructField('PM', StructType([
        StructField('Pf', StringType(), True),
        StructField('Rd', StringType(), True),
    ])),
])

cibledf = sqlCtx.createDataFrame(data, schema)
cibledf = sqlCtx.createDataFrame(sourcedf.rdd.map(lambda x: Row(PM=Row(Pf=str(x.id_prm), Rd=None))), schema)
print(cibledf.show())
job.commit()
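Worth noting when reading the snippet above: DataFrame.show() prints the rows itself and returns None, so print(cibledf.show()) emits the table followed by a literal "None". Below is a minimal local sketch of the same nested PM struct, assuming plain PySpark (no Glue context) and hypothetical sample values:

from pyspark.sql import SparkSession, Row
from pyspark.sql.types import StructType, StructField, StringType

spark = SparkSession.builder.appName("show-nested-struct").getOrCreate()

schema = StructType([
    StructField('PM', StructType([
        StructField('Pf', StringType(), True),
        StructField('Rd', StringType(), True),
    ])),
])

# Hypothetical rows standing in for sourcedf.rdd
rows = [Row(PM=Row(Pf="100", Rd=None)), Row(PM=Row(Pf="200", Rd=None))]
cibledf = spark.createDataFrame(rows, schema)

cibledf.show()          # prints the table and returns None
cibledf.printSchema()   # confirms the nested PM struct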

1 Answer:

Answer 0 (score: 0)

In the Glue console, after you run the Glue job, the job list has a "Logs / Error logs" column.

Click on Logs; it will take you to the CloudWatch logs associated with your job. Look through them for your print statements.
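For illustration (a sketch reusing sourcedf from the question, not code from the original answer): anything the job script writes to stdout, including show() output, lands in that CloudWatch output log, so a recognizable marker line makes it easier to find:

# Output written to stdout by the Glue job script ends up in the job's
# CloudWatch output log stream, so search for a marker string there.
print("=== sourcedf sample ===")
sourcedf.show(5)
print("=== row count:", sourcedf.count(), "===")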

Also check this: Convert dynamic frame to a dataframe and do show()

Added a working/tested code sample

Code sample:

zipcode_dynamicframe = glueContext.create_dynamic_frame.from_catalog(
       database = "customer_db",
       table_name = "zipcode_master")
zipcode_dynamicframe.printSchema()
zipcode_dynamicframe.toDF().show(10)
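The toDF() call is only needed to get the full Spark DataFrame API; as an alternative sketch (assuming the awsglue DynamicFrame.show() method is available in the Glue version in use), sample records can be printed without converting first:

# Print sample records directly from the DynamicFrame (assumption: DynamicFrame.show() is available)
zipcode_dynamicframe.show(10)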

Screenshot of the zipcode_dynamicframe.show() output in the CloudWatch logs:
