Checking Dataflow errors

Time: 2018-07-18 16:02:50

Tags: google-cloud-platform google-cloud-dataflow google-cloud-pubsub

I am trying to implement a data pipeline in which JSON messages from Pub/Sub are inserted into BigQuery through Dataflow. I am using the Pub/Sub-to-BigQuery template. My Dataflow job is failing: records are ending up in the error stream, but I cannot see any further details about the error. For example, is it failing because of an encoding problem in the Pub/Sub data, because of a schema mismatch, or something else? Where can I find these details? I have been checking Stackdriver Error Reporting and the logs, but I cannot find anything more specific.
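One way to narrow down whether this is an encoding problem or a schema mismatch is to replay a sample Pub/Sub payload through the same checks locally, before it reaches the pipeline. A minimal sketch (the field names in `EXPECTED_SCHEMA` are hypothetical placeholders; substitute the destination table's actual columns):

```python
import json

# Hypothetical target schema: column name -> Python type expected in the JSON.
# Replace with the actual schema of your BigQuery destination table.
EXPECTED_SCHEMA = {"user_id": str, "event_count": int}

def check_payload(raw_bytes):
    """Classify a Pub/Sub payload by the kind of failure it would cause."""
    try:
        text = raw_bytes.decode("utf-8")        # encoding error?
    except UnicodeDecodeError as e:
        return f"encoding error: {e}"
    try:
        row = json.loads(text)                  # malformed JSON?
    except json.JSONDecodeError as e:
        return f"json parse error: {e}"
    for field, typ in EXPECTED_SCHEMA.items():  # schema mismatch?
        if field not in row:
            return f"schema mismatch: missing field {field!r}"
        if not isinstance(row[field], typ):
            return (f"schema mismatch: {field!r} is "
                    f"{type(row[field]).__name__}, expected {typ.__name__}")
    return "ok"

print(check_payload(b'{"user_id": "abc", "event_count": 3}'))  # ok
print(check_payload(b'\xff\xfe'))                              # encoding error
print(check_payload(b'{"user_id": "abc"}'))                    # schema mismatch
```

Pulling a few messages from the subscription (e.g. with `gcloud pubsub subscriptions pull`) and running them through a check like this tells you which failure class you are dealing with, independently of the job's logs.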

Additionally, I can see this error:

  

    resource.type="dataflow_step"
    resource.labels.job_id="2018-07-17_20_36_16-6729875790634111180"
    logName="projects/camel-154800/logs/dataflow.googleapis.com%2Fworker"
    timestamp >= "2018-07-18T03:36:17Z"
    severity >= "INFO"
    resource.labels.step_id=("WriteFailedRecords/FailedRecordToTableRow"
    OR
    "WriteFailedRecords/WriteFailedRecordsToBigQuery/PrepareWrite/ParDo(Anonymous)"
    OR
    "WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/CreateTables/ParDo(CreateTables)"
    OR
    "WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/ShardTableWrites"
    OR
    "WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/TagWithUniqueIds"
    OR
    "WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign"
    OR
    "WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey"
    OR
    "WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable"
    OR
    "WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign"
    OR
    "WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/StreamingWrite")

This tells me that it failed, but not why. Is there a schema mismatch, a data type problem, an encoding error, or something else? How can I debug this?
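The `WriteFailedRecords` steps in the filter above suggest the template is routing failures through its dead-letter branch. If memory serves, the Pub/Sub-to-BigQuery template writes those records to a separate error table (by default named after the output table with an `_error_records` suffix, configurable via the `outputDeadletterTable` parameter), where an `errorMessage`-style column records the reason per record; verify the exact table and column names against your own job. The underlying pattern, sketched here in plain Python rather than Beam so it runs standalone, is simply to keep each failed payload together with the reason it failed:

```python
import json

# Dead-letter sketch: split incoming payloads into parsed rows and
# failed records, keeping the failure reason alongside each payload,
# which is the kind of record a dead-letter table stores.
def split_records(payloads):
    good, failed = [], []
    for raw in payloads:
        try:
            good.append(json.loads(raw.decode("utf-8")))
        except (UnicodeDecodeError, json.JSONDecodeError) as e:
            failed.append({"payload": repr(raw), "error_message": str(e)})
    return good, failed

good, failed = split_records([b'{"a": 1}', b"not json", b"\xff"])
print(len(good), len(failed))  # 1 2
```

So the per-record failure reason is more likely to be in that error table than in the worker logs; querying it directly should show whether each record failed on decoding, parsing, or insertion.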

0 Answers:

There are no answers