I want to stop a scrapy-redis spider safely once it has pushed all of its requests, and have those requests be completed by other scrapy-redis spiders.
def start_requests(self):
    # Build 10 request messages and hand each one to launce_request,
    # a custom helper on this spider (not shown here).
    for i in range(10):
        message = {}
        message['url'] = 'https://www.baidu.com/?tn=%s' % i
        message['object'] = 'request'
        message['method'] = 'get'
        message['page'] = str(1)
        message['tags'] = 'tags_test'
        message['delid'] = 1
        yield self.launce_request(**message)
    # Failing attempt to stop the spider after the requests are queued.
    exit()
I overrode the start_requests method and issued the 10 requests. The spider does stop after the requests are put into Redis, but this code fails: exit() raises SystemExit inside the request generator, so the whole process dies instead of Scrapy closing the spider cleanly.
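One option worth trying, though I have not verified it in this exact setup, is to ask Scrapy's engine to close the spider instead of calling exit(). This sketch assumes the spider was started through the normal crawl process (so self.crawler is set), that SCHEDULER_PERSIST = True is configured so the queued requests stay in Redis for the other spiders, and that the reason string 'requests_pushed' is arbitrary:

def start_requests(self):
    for i in range(10):
        message = {
            'url': 'https://www.baidu.com/?tn=%s' % i,
            'object': 'request',
            'method': 'get',
            'page': str(1),
            'tags': 'tags_test',
            'delid': 1,
        }
        yield self.launce_request(**message)
    # close_spider() shuts this spider down through Scrapy's own
    # machinery rather than killing the process, so requests already
    # scheduled into Redis remain there for other spiders to consume.
    self.crawler.engine.close_spider(self, reason='requests_pushed')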
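Alternatively, if this spider exists only to feed the queue, a plain script can push the URLs into Redis directly and the question of stopping a spider disappears. This is a sketch assuming the consumer spiders are scrapy-redis RedisSpider subclasses; the key name 'myspider:start_urls' is hypothetical and must match their redis_key:

import redis

# Hypothetical key; must match the consumers' redis_key setting.
START_URLS_KEY = 'myspider:start_urls'

r = redis.StrictRedis(host='localhost', port=6379, db=0)
for i in range(10):
    # Only the URL goes into Redis here; extra fields such as 'page'
    # or 'tags' would need a custom make_request_from_data() on the
    # consumer spiders to be carried along.
    r.lpush(START_URLS_KEY, 'https://www.baidu.com/?tn=%s' % i)

Because the feeder is an ordinary script, it simply exits when the loop finishes, while the consumer spiders keep running and drain the queue.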