Sending an unqueued/processed signal with Python's multiprocessing module

Date: 2018-07-28 15:26:49

Tags: python multithreading scrapy twisted twisted.internet

I have a piece of Python code that uses Scrapy:


import scrapy
import scrapy.crawler as crawler
from multiprocessing import Process, Queue
from twisted.internet import reactor

# your spider
class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ['http://quotes.toscrape.com/tag/humor/']

    def parse(self, response):
        for quote in response.css('div.quote'):
            print(quote.css('span.text::text').extract_first())

# the wrapper to make it run more times
def run_spider():
    def f(q):
        try:
            runner = crawler.CrawlerRunner()
            deferred = runner.crawl(QuotesSpider)
            deferred.addBoth(lambda _: reactor.stop())
            reactor.run()
            q.put(None)
        except Exception as e:
            q.put(e)

    q = Queue()
    p = Process(target=f, args=(q,))
    p.start()
    result = q.get()
    p.join()

    if result is not None:
        raise result

print('first run:')
run_spider()

print('\nsecond run:')
run_spider()

Right now, even when QuotesSpider returns nothing or raises an error, run_spider still gets executed.

How can I keep run_spider() from executing/being queued when QuotesSpider() errors out or returns nothing?
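
One direction that might work (a rough, untested sketch, not a confirmed answer): instead of putting None into the Queue, have f(q) read Scrapy's stats collector and send back the number of items scraped. The 'item_scraped_count' stat is incremented by Scrapy for each scraped item and is absent when nothing was scraped, hence the default of 0. The caller can then decide not to queue another run when the previous one produced no items or raised.

from multiprocessing import Process, Queue
import scrapy.crawler as crawler
from twisted.internet import reactor

def run_spider():
    def f(q):
        try:
            runner = crawler.CrawlerRunner()
            # create the Crawler object explicitly so its stats can be read after the crawl
            c = runner.create_crawler(QuotesSpider)  # QuotesSpider as defined above
            deferred = runner.crawl(c)
            deferred.addBoth(lambda _: reactor.stop())
            reactor.run()
            # report how many items were scraped instead of just None
            q.put(c.stats.get_value('item_scraped_count', 0))
        except Exception as e:
            q.put(e)

    q = Queue()
    p = Process(target=f, args=(q,))
    p.start()
    result = q.get()
    p.join()

    if isinstance(result, Exception):
        raise result
    return result  # number of items scraped

print('first run:')
if run_spider() == 0:
    print('spider returned nothing, not queuing another run')
else:
    print('\nsecond run:')
    run_spider()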

Thanks

0 Answers:

There are no answers yet.