I created a Scrapy spider that works fine (it does what it is supposed to do), but it never executes the destructor code (`__del__`) after finishing its work.
Versions: Python 2.7.3, Scrapy 0.24.6, Fedora 18
    class MySpider(scrapy.Spider):
        stuff

        def __del__(self):
            stuff_1

How can I get my "stuff_1" code to run when MySpider finishes?
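For context on why `__del__` never fires: CPython only calls `__del__` when an object's reference count drops to zero, and the framework keeps references to the spider (objects also often end up in reference cycles), so the destructor is deferred or never runs. A minimal plain-Python sketch, with no Scrapy involved and a hypothetical `Node` class standing in for the spider, showing a reference cycle deferring `__del__` until the garbage collector runs:

```python
import gc

finalized = []

class Node:
    """Hypothetical class standing in for the spider."""
    def __del__(self):
        finalized.append("finalized")

node = Node()
node.self_ref = node          # reference cycle: the object refers to itself
del node                      # refcount never reaches zero, __del__ deferred
deferred = (finalized == [])  # destructor has not run yet

gc.collect()                  # the cycle collector finalizes it (Python 3.4+;
                              # in Python 2 such cycles were never collected)
collected = (finalized == ["finalized"])
```

This is why a `__del__`-based shutdown hook is unreliable and the answers below switch to Scrapy's signal mechanism instead.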
Answer 0: (score: 0)
Use signals, specifically the spider_closed signal:
    import scrapy
    from scrapy import signals
    from scrapy.xlib.pydispatch import dispatcher  # legacy import path

    class MySpider(scrapy.Spider):
        def __init__(self):
            dispatcher.connect(self.spider_closed, signals.spider_closed)

        def spider_closed(self, spider):
            stuff_1()
Answer 1: (score: 0)
@alecxe's answer is now deprecated. Use the from_crawler class method instead:
    import scrapy

    class MySpider(scrapy.Spider):
        ...

        @classmethod
        def from_crawler(cls, crawler, *args, **kwargs):
            spider = super(MySpider, cls).from_crawler(crawler, *args, **kwargs)
            crawler.signals.connect(spider.spider_closed, signal=scrapy.signals.spider_closed)
            return spider

        def spider_closed(self, spider):
            spider.logger.info('Spider closed')
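Both answers boil down to the same pattern: register a callback with a signal manager, and the framework invokes it at shutdown. A minimal plain-Python sketch of that connect/dispatch mechanism (the `SignalManager` class and all names here are illustrative, not Scrapy's actual implementation):

```python
class SignalManager:
    """Illustrative stand-in for a signal manager (not Scrapy's real API)."""
    def __init__(self):
        self._receivers = {}

    def connect(self, receiver, signal):
        # Remember which callables want to hear about this signal.
        self._receivers.setdefault(signal, []).append(receiver)

    def send(self, signal, **kwargs):
        # Invoke every registered receiver with the signal's payload.
        for receiver in self._receivers.get(signal, []):
            receiver(**kwargs)

spider_closed = object()  # a signal is just a unique token

log = []

class MySpider:
    def spider_closed(self, spider):
        log.append("closed: " + spider)

manager = SignalManager()
spider = MySpider()
# Equivalent of crawler.signals.connect(...) in the answers above:
manager.connect(spider.spider_closed, signal=spider_closed)

# The framework fires the signal when the crawl finishes:
manager.send(spider_closed, spider="my_spider")
```

The key design point is that the framework, not the object's lifetime, decides when the cleanup hook runs, which is why this works where `__del__` does not.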