I am trying to push my scraped data to a Firebase account in the cloud, but when I run the spider I get this ImportError. I tried creating a new project and even reinstalling shub and Python with a specific version of firebase, but nothing helped.
The spider runs perfectly on my machine and shows no ImportErrors. Here is the error log.
Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/scrapy/utils/defer.py", line 102, in iter_errback
yield next(it)
File "/usr/local/lib/python2.7/site-packages/sh_scrapy/middlewares.py", line 30, in process_spider_output
for x in result:
File "/usr/local/lib/python2.7/site-packages/scrapy/spidermiddlewares/offsite.py", line 29, in process_spider_output
for x in result:
File "/usr/local/lib/python2.7/site-packages/scrapy/spidermiddlewares/referer.py", line 339, in <genexpr>
return (_set_referer(r) for r in result or ())
File "/usr/local/lib/python2.7/site-packages/scrapy/spidermiddlewares/urllength.py", line 37, in <genexpr>
return (r for r in result or () if _filter(r))
File "/usr/local/lib/python2.7/site-packages/scrapy/spidermiddlewares/depth.py", line 58, in <genexpr>
return (r for r in result or () if _filter(r))
File "/app/__main__.egg/Terminator/spiders/IcyTermination.py", line 18, in parse
from firebase import firebase
ImportError: No module named firebase
Any help?
Answer 0 (score: 1)
I can't comment due to reputation, but did you create your requirements.txt? Here you will find how to deploy your own dependencies to Scrapinghub.
Basically, you create a requirements.txt file at the root of your project, with one dependency per line, and add
requirements_file: requirements.txt
to your scrapinghub.yml file.
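As a sketch, the two files might look like this (the python-firebase package name and the project ID are assumptions; check which package actually provides `from firebase import firebase` in your environment):

```
# requirements.txt — at the project root, one dependency per line
python-firebase
```

```yaml
# scrapinghub.yml
project: 12345                        # placeholder Scrapinghub project ID
requirements_file: requirements.txt   # tells Scrapy Cloud to install the packages above
```

After adding both files, redeploy with `shub deploy` so the cloud environment installs the listed packages before running the spider.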