Google Cloud Dataflow cannot import google.cloud.datastore

Asked: 2017-10-20 00:40:05

Tags: python google-app-engine google-cloud-dataflow

Here is my import code:

from __future__ import absolute_import

import datetime
import json
import logging
import re

import apache_beam as beam
from apache_beam import combiners
from apache_beam.io.gcp.bigquery import parse_table_schema_from_json
from apache_beam.io.gcp.datastore.v1.datastoreio import ReadFromDatastore
from apache_beam.pvalue import AsDict
from apache_beam.pvalue import AsSingleton
from apache_beam.options.pipeline_options import PipelineOptions

from google.cloud.proto.datastore.v1 import query_pb2
from google.cloud import datastore
from googledatastore import helper as datastore_helper, PropertyFilter

# datastore entities that we need to perform the mapping computations
#from models import UserPlan, UploadIntervalCount, RollingMonthlyCount

This is what my requirements.txt file looks like:

$ cat requirements.txt
Flask==0.12.2
apache-beam[gcp]==2.1.1
gunicorn==19.7.1
google-cloud-dataflow==2.1.1
six==1.10.0
google-cloud-datastore==1.3.0
google-cloud

This is all in the /lib directory, which has the following contents:

$ ls -1 lib/google/cloud
__init__.py
_helpers.py
_helpers.pyc
_http.py
_http.pyc
_testing.py
_testing.pyc
bigquery
bigtable
client.py
client.pyc
datastore
dns
environment_vars.py
environment_vars.pyc
error_reporting
exceptions.py
exceptions.pyc
gapic
iam.py
iam.pyc
language
language_v1
language_v1beta2
logging
monitoring
obselete.py
obselete.pyc
operation.py
operation.pyc
proto
pubsub
resource_manager
runtimeconfig
spanner
speech
speech_v1
storage
translate.py
translate.pyc
translate_v2
videointelligence.py
videointelligence.pyc
videointelligence_v1beta1
vision
vision_v1

Note that both the datastore and proto folders exist under lib/google/cloud. However, this import line works fine:

from google.cloud.proto.datastore.v1 import query_pb2

But this one fails:

from google.cloud import datastore

Here is the exception (taken from the online Google Cloud Dataflow console):

(9b49615f4d91c1fb): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 582, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 166, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 294, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10607)
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 295, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10501)
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 300, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:9702)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 225, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_pipeline/counters_pipeline.py", line 25, in <module>
    from google.cloud import datastore
ImportError: No module named datastore

Why can't the package be found?

1 Answer:

Answer 0 (score: 2)

External dependencies must be installed via setup.py, and that file should be passed to the pipeline via the --setup_file argument. In setup.py you can use a custom command to install the package:
pip install google-cloud-datastore==1.3.0

or add your package to REQUIRED_PACKAGES:

REQUIRED_PACKAGES = ["google-cloud-datastore==1.3.0"]
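A minimal setup.py along these lines could carry that list (the package name and version string here are placeholders; adjust them for your project):

```python
# setup.py -- lives next to the pipeline module and is passed via --setup_file
import setuptools

# Packages the Dataflow workers must install before running the pipeline.
REQUIRED_PACKAGES = [
    "google-cloud-datastore==1.3.0",
]

setuptools.setup(
    name="dataflow-pipeline",   # placeholder project name
    version="0.0.1",
    install_requires=REQUIRED_PACKAGES,
    packages=setuptools.find_packages(),
)
```

Dataflow builds a source distribution from this file and installs it (with its install_requires) on every worker.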

The reason you need to specify it in setup.py is that the libraries in appengine_config are not used during Dataflow execution. App Engine only acts as a scheduler here: it merely submits the job to the Dataflow service. Dataflow then spins up worker machines to execute the pipeline, and those workers cannot reach App Engine in any way. The Dataflow workers must have every package the pipeline needs, which is why you specify the required packages in a setup.py file. The Dataflow workers use this file to "set themselves up".
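Putting it together, the job submission might look like the following sketch (the project ID and bucket name are hypothetical; counters_pipeline.py is the module from the traceback above):

```shell
# Submit the pipeline; --setup_file tells Dataflow how to build
# the environment on each worker machine.
python counters_pipeline.py \
  --runner DataflowRunner \
  --project my-gcp-project \
  --temp_location gs://my-bucket/temp \
  --setup_file ./setup.py
```

With --setup_file supplied, the ImportError on the workers should disappear, because google-cloud-datastore is now installed there rather than only vendored into App Engine's /lib.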