exceptions.TypeError: cannot convert dictionary update sequence element #1 to a sequence?

Asked: 2015-10-22 10:51:46

Tags: python web-crawler scrapy

I am using an open Scrapy project to crawl video comments from Tencent, but I got an error and I don't know how to fix it.

2015-10-22 18:33:58 [scrapy] INFO: Scrapy 1.0.1 started (bot: qqtvurl)
2015-10-22 18:33:58 [scrapy] INFO: Optional features available: ssl, http11, boto
2015-10-22 18:33:58 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'qqtvurl.spiders', 'SPIDER_MODULES': ['qqtvurl.spiders'], 'SCHEDULER': 'scrapy_redis.scheduler.Scheduler', 'BOT_NAME': 'qqtvurl'}
2015-10-22 18:33:58 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, LogStats, CoreStats, SpiderState
2015-10-22 18:33:58 [qqtvspider] DEBUG: Reading URLs from redis list 'qqtvspider:star_urls'
Unhandled error in Deferred:
2015-10-22 18:33:58 [twisted] CRITICAL: Unhandled error in Deferred:


Traceback (most recent call last):
  File "D:\anzhuang\Anaconda\lib\site-packages\scrapy\cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "D:\anzhuang\Anaconda\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "D:\anzhuang\Anaconda\lib\site-packages\scrapy\crawler.py", line 153, in crawl
    d = crawler.crawl(*args, **kwargs)
  File "D:\anzhuang\Anaconda\lib\site-packages\twisted\internet\defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "D:\anzhuang\Anaconda\lib\site-packages\twisted\internet\defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "D:\anzhuang\Anaconda\lib\site-packages\scrapy\crawler.py", line 71, in crawl
    self.engine = self._create_engine()
  File "D:\anzhuang\Anaconda\lib\site-packages\scrapy\crawler.py", line 83, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "D:\anzhuang\Anaconda\lib\site-packages\scrapy\core\engine.py", line 66, in __init__
    self.downloader = downloader_cls(crawler)
  File "D:\anzhuang\Anaconda\lib\site-packages\scrapy\core\downloader\__init__.py", line 65, in __init__
    self.handlers = DownloadHandlers(crawler)
  File "D:\anzhuang\Anaconda\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 17, in __init__
    handlers.update(crawler.settings.get('DOWNLOAD_HANDLERS', {}))
exceptions.TypeError: cannot convert dictionary update sequence element #1 to a sequence
2015-10-22 18:33:58 [twisted] CRITICAL:

I added the following code in settings.py:

DOWNLOAD_HANDLERS = {'S3', None,}

When I run the project, the error above occurs. Thanks a lot!!!

2 Answers:

Answer 0 (score: 1)

That's because you are passing a set of elements where a dictionary is expected.

You should write:

DOWNLOAD_HANDLERS = {'S3': None,}

or something similar.

You can read more about how to set the value of DOWNLOAD_HANDLERS, with examples, here: http://doc.scrapy.org/en/latest/topics/settings.html#download-handlers-base
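For illustration, a settings.py entry in the dict format the documentation describes might look like the sketch below; note that Scrapy's built-in handler keys are lowercase URL schemes such as 's3', so if the goal is to disable the built-in S3 handler, the lowercase key is likely what is wanted:

```python
# Sketch of the dict form DOWNLOAD_HANDLERS expects in settings.py.
# Keys are URL schemes; mapping a scheme to None disables its handler.
DOWNLOAD_HANDLERS = {
    's3': None,  # disable the built-in S3 download handler
}
```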

Answer 1 (score: 1)

{'S3', None,} is a set, while the code expects DOWNLOAD_HANDLERS to be a dict or a sequence of (key, value) tuples.

In other words, replace {'S3', None,} with {'S3': None} and you should no longer get this error.
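The difference can be reproduced outside Scrapy in a few lines: curly braces with commas build a set, while colons build a dict, and dict.update() accepts only a mapping or an iterable of (key, value) pairs. A minimal sketch (the 'file' handler entry is just a placeholder):

```python
# update() with a dict merges cleanly.
handlers = {'file': 'FileDownloadHandler'}  # placeholder existing handlers
handlers.update({'S3': None})               # dict literal: works
print(handlers['S3'])                       # None

# update() with a set tries to unpack each element as a (key, value)
# pair. 'S3' happens to be a 2-character string and unpacks as
# ('S', '3'), but None cannot be converted to a pair, so a TypeError
# like the one in the traceback is raised.
try:
    handlers.update({'S3', None})           # set literal: fails
except TypeError as exc:
    print(exc)
```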