Scrapy KeyError - unable to find spider

Date: 2017-01-06 14:22:40

Tags: python ubuntu scrapy virtualenv virtualenvwrapper

I have a Scrapy project that won't run from the command line with scrapy crawl <spider-name>.

I have just moved to a new development environment on Ubuntu 16.04, so I wanted to double-check that the problem isn't related to my setup. To do that, I created a clean virtual environment with Python 2.7.12 using virtualenvwrapper and followed the tutorial instructions in the Scrapy docs (v1.1, to match my other projects).

Despite the fresh environment, I still see the same strange behaviour from Scrapy; it won't:

  • list the spiders with scrapy list
  • list the settings with scrapy settings
  • start a crawl with scrapy crawl quotes

scrapy crawl quotes raises the following error:

2017-01-06 14:20:50 [scrapy] INFO: Scrapy 1.1.1 started (bot: scrapybot)
2017-01-06 14:20:50 [scrapy] INFO: Overridden settings: {}
Traceback (most recent call last):
  File "/home/alan/QueryClick/.virtualenvs/test/bin/scrapy", line 11, in <module>
    sys.exit(execute())
  File "/home/alan/QueryClick/.virtualenvs/test/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 142, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/home/alan/QueryClick/.virtualenvs/test/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "/home/alan/QueryClick/.virtualenvs/test/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 149, in _run_command
    cmd.run(args, opts)
  File "/home/alan/QueryClick/.virtualenvs/test/local/lib/python2.7/site-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/home/alan/QueryClick/.virtualenvs/test/local/lib/python2.7/site-packages/scrapy/crawler.py", line 162, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "/home/alan/QueryClick/.virtualenvs/test/local/lib/python2.7/site-packages/scrapy/crawler.py", line 190, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "/home/alan/QueryClick/.virtualenvs/test/local/lib/python2.7/site-packages/scrapy/crawler.py", line 194, in _create_crawler
    spidercls = self.spider_loader.load(spidercls)
  File "/home/alan/QueryClick/.virtualenvs/test/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 43, in load
    raise KeyError("Spider not found: {}".format(spider_name))
KeyError: 'Spider not found: quotes'

My directory structure is:

└── tutorial
    ├── scrapy.cfg
    └── tutorial
        ├── __init__.py
        ├── items.py
        ├── pipelines.py
        ├── settings.py
        └── spiders
            ├── __init__.py
            └── quote_spider.py
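As an aside, scrapy crawl quotes resolves "quotes" against each spider class's name attribute, not the filename, so a mismatch there produces exactly this KeyError. Below is a stdlib-only sketch (Python 3; spider_names is a hypothetical helper, not part of Scrapy) of that name lookup:

```python
import ast
import tempfile
from pathlib import Path

def spider_names(spiders_dir):
    """Collect the `name = "..."` class attributes from the .py files."""
    names = []
    for path in Path(spiders_dir).glob("*.py"):
        tree = ast.parse(path.read_text())
        for node in ast.walk(tree):
            if not isinstance(node, ast.ClassDef):
                continue
            for stmt in node.body:
                if (isinstance(stmt, ast.Assign)
                        and isinstance(stmt.value, ast.Constant)
                        and any(isinstance(t, ast.Name) and t.id == "name"
                                for t in stmt.targets)):
                    names.append(stmt.value.value)
    return names

# demo against a throwaway spiders/ directory
tmp = tempfile.mkdtemp()
Path(tmp, "quote_spider.py").write_text(
    'class QuotesSpider:\n    name = "quotes"\n'
)
print(spider_names(tmp))  # → ['quotes']
```

If the name printed here doesn't match what you pass to scrapy crawl, that alone would explain the "Spider not found" KeyError.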

I have also double-checked all the system requirements described in the Scrapy docs. A teammate reproduced the issue on Ubuntu 14.04, following the same virtualenv steps and using the same setup as mine.

If anyone can shed some light on this, I'd be eternally grateful.

Edit: adding settings.py

The only active content in settings.py is:

BOT_NAME = 'tutorial'
SPIDER_MODULES = ['tutorial.spiders']
NEWSPIDER_MODULE = 'tutorial.spiders'
ROBOTSTXT_OBEY = True

Edit: sharing scrapy.cfg

# Automatically created by: scrapy startproject
#
# For more information about the [deploy] section see:
# https://scrapyd.readthedocs.org/en/latest/deploy.html

[settings]
default = tutorial.settings

[deploy]
#url = http://localhost:6800/
project = tutorial

2 Answers:

Answer 0 (score: 0)

This turned out to be a clash with some environment variables set for the Django project that invokes the spiders: they were prefixed SCRAPY_ and must have been conflicting.
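A quick stdlib-only way to check for this kind of leak (the variable value below is a made-up stand-in to simulate the clash; SCRAPY_SETTINGS_MODULE is a real override Scrapy honours):

```python
import os

# Simulate a stray override left over from another project (hypothetical value).
# Scrapy reads SCRAPY_-prefixed environment variables such as
# SCRAPY_SETTINGS_MODULE, which can shadow the local scrapy.cfg/settings.py.
os.environ["SCRAPY_SETTINGS_MODULE"] = "django_project.scrapy_settings"

conflicts = sorted(k for k in os.environ if k.startswith("SCRAPY_"))
print(conflicts)  # any hits here are worth unsetting before `scrapy crawl`
```

Unsetting anything this turns up (e.g. with `unset SCRAPY_SETTINGS_MODULE`) before running the crawl should rule out this class of conflict.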

Edit: for reference: GitHub issue on undocumented environment variable(s)

Answer 1 (score: 0)

Make sure your spider file is a .py file. For example, if it was created from Jupyter it may be an .ipynb, which will not be picked up.
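A stdlib-only sketch of that check (skipped_files is a hypothetical helper): files in spiders/ that aren't .py modules are never imported, so any spiders defined in them are invisible to scrapy list and scrapy crawl.

```python
import tempfile
from pathlib import Path

def skipped_files(spiders_dir):
    """Files in spiders/ that aren't .py modules and so are never imported."""
    return sorted(p.name for p in Path(spiders_dir).iterdir()
                  if p.is_file() and p.suffix != ".py")

# demo against a throwaway directory containing a notebook instead of a module
tmp = tempfile.mkdtemp()
Path(tmp, "quote_spider.ipynb").write_text("{}")
Path(tmp, "__init__.py").write_text("")
print(skipped_files(tmp))  # → ['quote_spider.ipynb']
```

If this flags your spider file, renaming or exporting it as a plain .py module should make it discoverable again.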