I am evaluating Portia and have run into a problem deploying to scrapyd.
When I try to deploy my Portia project with

    scrapyd-deploy local -p new_project

from my Portia project directory, I get the following error message:
Packing version 1433441798
Deploying to project "new_project" in http://192.168.59.103:6800/addversion.json
Server response (200):
{"status": "error", "message": "ImportError: Error loading object
'slybot.spidermanager.ZipfileSlybotSpiderManager': No module named slybot.spidermanager"}
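For context on where this error comes from: scrapyd resolves the dotted path `slybot.spidermanager.ZipfileSlybotSpiderManager` (Portia sets it as the spider manager class) at import time, so if the `slybot` package is not installed in scrapyd's Python environment, the lookup fails with exactly this ImportError. A minimal sketch of that dotted-path resolution, loosely mirroring what Scrapy's `load_object` helper does (this is an illustration, not Scrapy's actual source):

```python
from importlib import import_module

def load_object(path):
    """Resolve a dotted path like 'pkg.module.ClassName' to the object it names."""
    module_path, _, name = path.rpartition(".")
    try:
        # Raises ImportError if the package is absent from this environment,
        # which is what happens with slybot on the scrapyd container.
        module = import_module(module_path)
    except ImportError as exc:
        raise ImportError("Error loading object %r: %s" % (path, exc))
    return getattr(module, name)

# Works for anything importable in the current environment:
print(load_object("json.dumps"))
```

The practical takeaway is that the error is about the scrapyd container's environment, not about the project being deployed: `slybot` must be importable by the Python that runs scrapyd.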
I have one Docker container running Portia and another running scrapyd.
My project runs successfully with portiacrawl.
I also created a simple Scrapy project using the startproject and genspider commands, and that one deployed to scrapyd successfully.
I don't know where to find the generated egg file.
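Regarding the egg file: scrapyd-deploy builds the egg in a temporary location and uploads it to the addversion.json endpoint; on the scrapyd side, uploaded eggs are stored under the eggs_dir configured in scrapyd.conf, in a subdirectory named after the project. A sketch of the relevant configuration, using scrapyd's defaults (adjust the path for your container):

    [scrapyd]
    eggs_dir = eggs

So inside the scrapyd container you could look under eggs_dir/new_project/ for the uploaded versions. If you want to inspect the egg locally before deploying, scrapyd-deploy also supports building it without uploading via its --build-egg option.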
Scrapyd log
2015-06-04 18:28:51+0000 [HTTPChannel,21,172.17.42.1] Unhandled Error
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/twisted/web/http.py", line 1618, in allContentReceived
req.requestReceived(command, path, version)
File "/usr/lib/python2.7/dist-packages/twisted/web/http.py", line 773, in requestReceived
self.process()
File "/usr/lib/python2.7/dist-packages/twisted/web/server.py", line 132, in process
self.render(resrc)
File "/usr/lib/python2.7/dist-packages/twisted/web/server.py", line 167, in render
body = resrc.render(self)
--- <exception caught here> ---
File "/usr/lib/pymodules/python2.7/scrapyd/webservice.py", line 18, in render
return JsonResource.render(self, txrequest)
File "/usr/lib/pymodules/python2.7/scrapy/utils/txweb.py", line 10, in render
r = resource.Resource.render(self, txrequest)
File "/usr/lib/python2.7/dist-packages/twisted/web/resource.py", line 216, in render
return m(request)
File "/usr/lib/pymodules/python2.7/scrapyd/webservice.py", line 66, in render_POST
spiders = get_spider_list(project)
File "/usr/lib/pymodules/python2.7/scrapyd/utils.py", line 65, in get_spider_list
raise RuntimeError(msg.splitlines()[-1])
exceptions.RuntimeError: ImportError: Error loading object 'slybot.spidermanager.ZipfileSlybotSpiderManager': No module named slybot.spidermanager
2015-06-04 18:28:51+0000 [HTTPChannel,21,172.17.42.1] 172.17.42.1 - - [04/Jun/2015:18:28:51 +0000] "POST /addversion.json HTTP/1.1" 200 156 "-" "Python-urllib/2.7"
Any ideas what I am missing?
Answer 0 (score: 0)
I was basically able to solve this by using the latest version from GitHub instead of the 1.0.1 release, together with Scrapy version 0.25.1.
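A sketch of what that fix could look like for the scrapyd image, as a Dockerfile fragment. The base image name is a placeholder, and the GitHub URL and version pins are assumptions based on this answer, not verified against a specific commit:

    # Dockerfile fragment for the scrapyd container (base image is hypothetical)
    FROM your-scrapyd-base-image
    # Replace the slybot 1.0.1 release with the current GitHub checkout
    # and pin Scrapy to 0.25.1, per the answer above.
    RUN pip uninstall -y slybot \
     && pip install scrapy==0.25.1 \
     && pip install git+https://github.com/scrapinghub/slybot.git

Rebuilding and restarting the scrapyd container afterwards ensures the new environment is picked up before retrying the deploy.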