How do I pass multiple arguments to a Scrapy spider (error: running 'scrapy crawl' with more than one spider is no longer supported)?

Asked: 2015-06-23 07:57:19

Tags: python scrapy

I want to pass multiple user-defined arguments to my Scrapy spider, so I tried to follow this post: How to pass a user defined argument in scrapy spider

However, when I do what it suggests, I get an error:

root@ scrapy crawl dmoz -a address= 40-18 48th st -a borough=4
Usage
=====
  scrapy crawl [options] <spider>

crawl: error: running 'scrapy crawl' with more than one spider is no longer supported

I also tried various permutations of quoting:

root@ scrapy crawl dmoz -a address= "40-18 48th st" -a borough="4"
Usage
=====
  scrapy crawl [options] <spider>
crawl: error: running 'scrapy crawl' with more than one spider is no longer supported

What is the correct way to pass arguments to a Scrapy spider? I want to pass a username and password for the spider's login/scraping process. Thanks for any suggestions.

1 Answer:

Answer 0 (score: 10)

My guess is this isn't a Scrapy problem. It's the way your shell interprets the input, splitting tokens at whitespace. Because of that, you can't have any space between a key and its value. Try:

scrapy crawl dmoz -a address="40-18 48th st" -a borough="4"
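For reference, here is a minimal sketch of how such arguments show up inside the spider; the spider name dmoz and the example.com URL are assumptions for illustration. Each -a key=value pair is passed to the spider's constructor as a keyword argument, and the default Spider.__init__ also sets it as an instance attribute.

import scrapy

class DmozSpider(scrapy.Spider):
    name = "dmoz"

    def __init__(self, address=None, borough=None, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Arguments passed with -a arrive here as keyword arguments.
        self.address = address
        self.borough = borough

    def start_requests(self):
        # Use the arguments however the crawl needs them, e.g. to build
        # the first request URL (example.com is a placeholder).
        url = f"http://example.com/search?address={self.address}&borough={self.borough}"
        yield scrapy.Request(url, callback=self.parse)

    def parse(self, response):
        self.logger.info("Visited %s", response.url)

The same pattern works for a username/password pair: accept them in __init__ and use them, for example, in a scrapy.FormRequest when submitting the login form.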