When submitting a Dataflow job to GCP, I get this error:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 766, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 482, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 266, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 402, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 818, in _import_module
    return __import__(import_name)
ImportError: No module named tensorflow_transform
My assumption was that requirements such as tensorflow-transform and apache-beam come pre-installed on the workers, and this did work a few months ago.
Answer 0: (score: 1)
Here is the solution, for anyone facing the same problem.
Assuming the file you run contains all the Beam steps, you need to place a setup.py file in the same directory as that file:
import setuptools

setuptools.setup(
    name='whatever-name',
    version='0.0.1',
    install_requires=[
        'apache-beam==2.10.0',
        'tensorflow-transform==0.12.0'
    ],
    packages=setuptools.find_packages(),
)
Then, in my Python file, the line
options = PipelineOptions()
had to be changed to:
options = PipelineOptions(setup_file="./setup.py")