How can I run a Scrapy spider that uses another spider's return value to crawl some URLs?

Asked: 2018-06-15 05:22:59

Tags: python scrapy

I am new to Scrapy, and I'm having a hard time reusing one spider's response in other spiders.

class MySpider1(scrapy.Spider):
    # some logic is here
    ...

class MySpider2(scrapy.Spider):
    # calls MySpider1: it needs the response of MySpider1 and then crawls some URLs
    ...

class MySpider3(scrapy.Spider):
    # also needs the response of MySpider1 and then crawls some URLs
    ...

How can I call MySpider1 from MySpider2 and MySpider3, and use MySpider1's return value to implement their logic?
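One pattern I am considering (this is my own sketch, not working Scrapy code): since spiders apparently cannot call each other directly, the shared extraction logic could be factored into a plain function or base class that each spider reuses in its callback. The names below (`extract_start_urls`, `BaseSpider`, the text-based `crawl` methods) are hypothetical placeholders; a real implementation would subclass `scrapy.Spider` and yield `scrapy.Request` objects instead of returning lists.

```python
# Plain-Python sketch of sharing MySpider1's extraction logic across spiders.
# Hypothetical names throughout; real code would subclass scrapy.Spider and
# yield scrapy.Request(url, callback=...) rather than return lists of tuples.

def extract_start_urls(response_text):
    """The logic that would live in MySpider1's parse(): pull URLs out of a response."""
    return [line for line in response_text.splitlines() if line.startswith("http")]

class BaseSpider:
    """Holds the step MySpider2 and MySpider3 both need from MySpider1."""

    def parse_seed(self, response_text):
        # Shared step: produce the URLs MySpider1 would have yielded.
        return extract_start_urls(response_text)

class MySpider2(BaseSpider):
    def crawl(self, response_text):
        # Reuse the shared extraction, then "follow" each URL.
        return [("MySpider2 follows", url) for url in self.parse_seed(response_text)]

class MySpider3(BaseSpider):
    def crawl(self, response_text):
        return [("MySpider3 follows", url) for url in self.parse_seed(response_text)]
```

Is this base-class approach the right direction, or is there a built-in Scrapy mechanism for passing one spider's results to another?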

Please help. Thanks.

0 Answers:

No answers yet.