DataflowRunner - refreshing due to a 401

Time: 2017-02-17 03:01:43

Tags: google-cloud-dataflow

I am running a pipeline on the DataflowRunner (Google Cloud Dataflow SDK for Python 0.5.5).

The pipeline:

(p
    | 'Read trip from BigQuery' >> beam.io.Read(beam.io.BigQuerySource(query=known_args.input))
    | 'Convert' >> beam.Map(lambda row: (row['HardwareId'],row))
    | 'Group devices' >> beam.GroupByKey()
    | 'Pull way info from mapserver' >> beam.FlatMap(get_osm_way)
    | 'Map way info to dictionary' >> beam.FlatMap(convert_to_dict)
    | 'Save to BQ' >> beam.io.Write(beam.io.BigQuerySink(
            known_args.output,
            schema=schema_string,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE))
  )
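For context, here is a minimal sketch of what the two `FlatMap` callbacks might look like, runnable without Beam. The real `get_osm_way` makes an HTTP request to an external map server; the fake `way_info` response and the field names below are assumptions for illustration only:

```python
def get_osm_way(grouped):
    """Takes a (HardwareId, rows) pair produced by GroupByKey and yields
    one element per row. The HTTP call to the map server is replaced here
    by a placeholder response."""
    hardware_id, rows = grouped
    for row in rows:
        # In the real pipeline this is an external HTTP request, which can
        # slow down or hang -- the step most worth instrumenting.
        way_info = {'way_id': 123, 'name': 'placeholder'}  # fake response
        yield (hardware_id, row, way_info)

def convert_to_dict(element):
    """Flattens a (HardwareId, row, way_info) tuple into a single flat
    dict matching the BigQuery output schema (field names assumed)."""
    hardware_id, row, way_info = element
    out = {'HardwareId': hardware_id}
    out.update(row)
    out.update(way_info)
    yield out

# Local smoke test of the two callbacks chained together:
grouped = ('dev-1', [{'lat': 1.0, 'lon': 2.0}])
results = [d for t in get_osm_way(grouped) for d in convert_to_dict(t)]
print(results[0]['HardwareId'])  # dev-1
```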

It runs with autoscaling, and the runner scaled it up to 15 workers.

More detailed code is in another StackOverflow question of mine.

After running for about 2 hours, it reported:

19:41:19.908
Attempting refresh to obtain initial access_token
 {
 insertId: "jf9yr4g1sv0qku"   
 jsonPayload: {
  message: "Attempting refresh to obtain initial access_token"    
  worker: "beamapp-root-0216221014-5-02161410-29cb-harness-xqx2"    
  logger: "oauth2client.client:client.py:new_request"    
  thread: "110:140052132222720"    
  job: "2017-02-16_14_10_18-17481182243152998182"    
 }
 resource: {…}   
 timestamp: "2017-02-17T00:41:19.908143997Z"   
 severity: "INFO"   
 labels: {…}   
 logName: "projects/fiona-zhao/logs/dataflow.googleapis.com%2Fworker"   
}

and then began repeatedly reporting "Refreshing due to a 401". One of those entries:

21:45:12.886
Refreshing due to a 401 (attempt 1/2)
 {
 insertId: "zsorfgg1urhvty"   
 jsonPayload: {
  worker: "beamapp-root-0216221014-5-02161410-29cb-harness-xqx2"    
  logger: "oauth2client.client:client.py:new_request"    
  thread: "110:140052273633024"    
  job: "2017-02-16_14_10_18-17481182243152998182"    
  message: "Refreshing due to a 401 (attempt 1/2)"    
 }
 resource: {…}  
 timestamp: "2017-02-17T02:45:12.886137962Z"   
 severity: "INFO"   
 labels: {
  compute.googleapis.com/resource_name: "dataflow-beamapp-root-0216221014-5-02161410-29cb-harness-xqx2"    
  dataflow.googleapis.com/job_id: "2017-02-16_14_10_18-17481182243152998182"    
  dataflow.googleapis.com/job_name: "beamapp-root-0216221014-530646"    
  dataflow.googleapis.com/region: "global"    
  compute.googleapis.com/resource_type: "instance"    
  compute.googleapis.com/resource_id: "2301951363070532306"    
 }
 logName: "projects/fiona-zhao/logs/dataflow.googleapis.com%2Fworker"   
}

What should I do?

1 answer:

Answer 0: (score: 1)

These log messages are a normal part of execution and do not by themselves indicate an error. My suggestion would be to add extra logging to help debug hanging external API calls or execution steps.
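As a sketch of that suggestion, the external map-server call inside `get_osm_way` could be wrapped with timing logs so slow or hanging requests stand out in the Dataflow worker logs. `call_map_server` is a placeholder standing in for whatever HTTP call the real pipeline makes:

```python
import logging
import time

def call_map_server(row):
    # Placeholder for the real HTTP request to the external map server.
    return {'way_id': 0}

def get_osm_way(grouped):
    """Same callback as in the pipeline, with before/after logging added
    around the external call. On Dataflow these log lines appear in the
    worker logs, so a request that never returns is easy to spot: its
    'calling' line will have no matching 'responded' line."""
    hardware_id, rows = grouped
    for row in rows:
        start = time.time()
        logging.info('Calling map server for HardwareId=%s', hardware_id)
        way_info = call_map_server(row)
        logging.info('Map server responded for HardwareId=%s in %.1fs',
                     hardware_id, time.time() - start)
        yield (hardware_id, row, way_info)
```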

While we cannot comment on the specific execution details of a particular job in this open forum, the Cloud Dataflow team can provide further support via the dataflow-feedback@google.com mailing list.