AttributeError: 'module' object has no attribute 'ensure_str'

Asked: 2019-07-29 10:01:13

Tags: python google-cloud-dataflow apache-beam

I am trying to move data from one BigQuery table to another via Beam, but I get the following error:

WARNING:root:Retry with exponential backoff: waiting for 4.12307941111 seconds before retrying get_query_location because we caught exception: AttributeError: 'module' object has no attribute 'ensure_str'
 Traceback for above exception (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 197, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 261, in get_query_location
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 342, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 703, in _RunMethod
    download)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 674, in PrepareHttpRequest
    method_config.query_params, request, global_params)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 551, in __ConstructQueryParams
    global_params, self.__client.global_params)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 357, in global_params
    return encoding.CopyProtoMessage(self._default_global_params)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py", line 112, in CopyProtoMessage
    return JsonToMessage(type(message), MessageToJson(message))
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py", line 123, in JsonToMessage
    return _ProtoJsonApiTools.Get().decode_message(message_type, message)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py", line 309, in decode_message
    message_type, result)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py", line 209, in decode_message
    encoded_message = six.ensure_str(encoded_message)

Here is my code:

import argparse

import apache_beam as beam
from apache_beam import pvalue
from apache_beam.options.pipeline_options import (PipelineOptions,
                                                  SetupOptions,
                                                  StandardOptions)


class SplitBDoFn(beam.DoFn):
  word_tag = 'word_tag'

  def process(self, element):
    if element:
      yield pvalue.TaggedOutput(self.word_tag, element)

def run(argv=None):
  parser = argparse.ArgumentParser()
  known_args, pipeline_args = parser.parse_known_args(argv)
  pipeline_args.extend([
      '--runner=DirectRunner', 
      '--project=myproject',
      '--gcs_location=US',
      '--staging_location=gs://test-bucket/stage',
      '--temp_location=gs://test-bucket/temp',
      '--job_name=test-job',
  ])

  pipeline_options = PipelineOptions(pipeline_args)
  pipeline_options.view_as(SetupOptions).save_main_session = True
  pipeline_options.view_as(StandardOptions).streaming = True
  with beam.Pipeline(options = pipeline_options) as p:
    bq_source = beam.io.BigQuerySource(query = 'select * from myproject:raw_data.events where utc_date = "2019-07-20"')
    bq_data = p | beam.io.Read(bq_source)

    multiple_lines = (
        bq_data
        | 'SplitBDoFn' >> (beam.ParDo(SplitBDoFn()).with_outputs(
                                      SplitBDoFn.word_tag)))

    word_tag = multiple_lines.word_tag

    (word_tag
        | "output_word_tag" >> beam.io.WriteToBigQuery(
                              table = 'test',
                              dataset = 'temp',
                              project = 'myproject',
                              schema = data_schema,
                              # validate = True,
                              write_disposition = beam.io.BigQueryDisposition.WRITE_APPEND,
                              create_disposition = beam.io.BigQueryDisposition.CREATE_IF_NEEDED
                        ))

Beam version: 2.13.0

Can anyone help with this issue? Or is there any mistake in my code?

2 Answers:

Answer 0 (score: 1)

It looks like `ensure_str` was added to `six` in its 1.12.0 release, and it should be pulled in by apitools.
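For context, `six.ensure_str` coerces bytes or text to the native `str` type. A minimal sketch of its Python 3 semantics (not the library's actual implementation) shows why older `six` releases that lack it break apitools' JSON decoding:

```python
def ensure_str(s, encoding='utf-8', errors='strict'):
    # Coerce bytes or str to the native str type (Python 3 semantics).
    if isinstance(s, bytes):
        return s.decode(encoding, errors)
    if isinstance(s, str):
        return s
    raise TypeError("not expecting type '%s'" % type(s))

print(ensure_str(b'encoded message'))   # bytes are decoded to str
print(ensure_str('already a str'))      # str passes through unchanged
```

On six 1.11 or older this attribute simply does not exist on the module, which is exactly the `AttributeError` in the traceback above.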

I suspect the root cause is that an old version of six (1.11 or older) is installed in your virtual environment. Could you try creating a new virtualenv before trying your pipeline again, or running the quick-start example?

Answer 1 (score: 1)

This fixed the same problem for me:

pip install six==1.12.0
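To confirm the upgrade took effect, you can compare the installed version against 1.12.0. A small sketch (the `installed` value here is hypothetical; in practice read `six.__version__`):

```python
def version_tuple(v):
    # '1.11.0' -> (1, 11, 0) so versions compare numerically, not lexically
    return tuple(int(part) for part in v.split('.'))

installed = '1.11.0'  # hypothetical; in practice use six.__version__
too_old = version_tuple(installed) < version_tuple('1.12.0')
print(too_old)  # True: 1.11.0 predates ensure_str
```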