I ran into a situation where my Scrapy code works fine when run from the command line, but after deploying it with scrapyd-deploy and scheduling the same spider through the scrapyd API, it throws errors inside the scrapy.extensions.feedexport.FeedExporter class.
1. Error on the "open_spider" signal:
2016-05-14 12:09:38 [scrapy] INFO: Spider opened
2016-05-14 12:09:38 [scrapy] ERROR: Error caught on signal handler: <bound method ?.open_spider of <scrapy.extensions.feedexport.FeedExporter object at 0x7fafb1ce4a90>>
Traceback (most recent call last):
File "/home/jonsnow/venv/scrapy1/lib/python2.7/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/home/jonsnow/venv/scrapy1/lib/python2.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/home/jonsnow/venv/scrapy1/lib/python2.7/site-packages/scrapy/extensions/feedexport.py", line 185, in open_spider
uri = self.urifmt % self._get_uri_params(spider)
TypeError: float argument required, not dict
2. Error on the "item_scraped" signal:
2016-05-14 12:09:49 [scrapy] DEBUG: Scraped from <200 https://someurl.>
2016-05-14 12:09:49 [scrapy] ERROR: Error caught on signal handler: <bound method ?.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x7fafb1ce4a90>>
Traceback (most recent call last):
File "/home/jonsnow/venv/scrapy1/lib/python2.7/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/home/jonsnow/venv/scrapy1/lib/python2.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/home/jonsnow/venv/scrapy1/lib/python2.7/site-packages/scrapy/extensions/feedexport.py", line 210, in item_scraped
slot = self.slot
AttributeError: 'FeedExporter' object has no attribute 'slot'
3. Error on the "close_spider" signal:
2016-05-14 12:09:49 [scrapy] INFO: Closing spider (finished)
2016-05-14 12:09:49 [scrapy] ERROR: Error caught on signal handler: <bound method ?.close_spider of <scrapy.extensions.feedexport.FeedExporter object at 0x7fafb1ce4a90>>
Traceback (most recent call last):
File "/home/jonsnow/venv/scrapy1/lib/python2.7/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/home/jonsnow/venv/scrapy1/lib/python2.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/home/jonsnow/venv/scrapy1/lib/python2.7/site-packages/scrapy/extensions/feedexport.py", line 193, in close_spider
slot = self.slot
AttributeError: 'FeedExporter' object has no attribute 'slot'
I also tried scrapyd 1.1.0 (with Scrapy version 0.24.6).
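For context on the first traceback: it fails at uri = self.urifmt % self._get_uri_params(spider), which is plain old-style %-formatting of the feed URI template against a dict of parameters. The sketch below uses made-up values (not my actual settings) to show how a bare conversion such as %f somewhere in FEED_URI or in scrapyd's items_dir produces exactly this TypeError, while named placeholders like %(name)s and %(time)s format cleanly:

# Minimal sketch of what FeedExporter.open_spider does with the URI template.
# The parameter values below are hypothetical.
uri_params = {'name': 'myspider', 'time': '2016-05-14T12-09-38'}

# Named placeholders look keys up in the dict, so this works:
good_uri = 'items/%(name)s/%(time)s.json' % uri_params
print(good_uri)  # items/myspider/2016-05-14T12-09-38.json

# A bare conversion treats the whole dict as a single value, so this raises
# the same error as in the log (Python 2.7 wording):
try:
    bad_uri = 'items/%f/output.json' % uri_params
except TypeError as exc:
    print(exc)  # float argument required, not dict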
Answer 0 (score: 11)
This happens when the feed exporter cannot write to the output file. It happened to me when a previously exported CSV file was still open in Excel. Close the open export file and it will work fine.
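A quick way to confirm that the file itself is the problem, before re-scheduling the job, is to try opening it for appending (the path here is just an example; use whatever your FEED_URI points at):

# Check that the export file can be opened for writing. On Windows a file
# that Excel still has open is locked and this open() will fail; on Linux
# it will fail if the file is not writable by the current user.
feed_path = 'output.csv'  # example path, not taken from the question

try:
    with open(feed_path, 'a'):
        pass
    print('feed file looks writable')
except (IOError, OSError) as exc:
    print('cannot write feed file: %s' % exc)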
Answer 1 (score: 0)
Sometimes this happens even when the file is closed, and the results still aren't saved because of a permission problem. Try running the command with elevated privileges (sudo on Linux, "Run as administrator" on Windows); I tried that and it solved the problem.
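If you want to check the permission problem directly instead of going straight to root, something like this can tell you whether the user that scrapyd runs as can actually write to the items directory (the path is an assumption; your scrapyd items_dir setting decides where feeds are written):

import os

items_dir = '/var/lib/scrapyd/items'  # assumed location, check your scrapyd config

# os.access checks the permissions of the user running this snippet,
# so run it as the same user the scrapyd daemon runs as.
if os.access(items_dir, os.W_OK | os.X_OK):
    print('%s is writable' % items_dir)
else:
    print('no write permission on %s; fix ownership or run with higher privileges' % items_dir)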