Error running my first Scrapy project

Date: 2016-12-13 22:45:08

Tags: python web scrapy pip

Hi, I've been at this all day and finally got it working, but since I'm completely new to Python, I suspect I didn't install it correctly.

I tried to start my first Scrapy project following the scrapy.org manual (page 9), but I get an error when I try to run the project.

This is the error I get:

[root@vnode01 sproject]# scrapy runspider quotes_spider.py -o quotes.json
2016-12-13 01:33:44 [scrapy] INFO: Scrapy 1.2.2 started (bot: scrapybot)
2016-12-13 01:33:44 [scrapy] INFO: Overridden settings: {'FEED_URI': 'quotes.json', 'FEED_FORMAT': 'json'}
2016-12-13 01:33:45 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole']
2016-12-13 01:33:45 [scrapy] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.chunked.ChunkedTransferMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2016-12-13 01:33:45 [scrapy] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2016-12-13 01:33:45 [scrapy] INFO: Enabled item pipelines:
[]
2016-12-13 01:33:45 [scrapy] INFO: Spider opened
Unhandled error in Deferred:
2016-12-13 01:33:45 [twisted] CRITICAL: Unhandled error in Deferred:

2016-12-13 01:33:45 [twisted] CRITICAL:
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python3.5/site-packages/scrapy/crawler.py", line 74, in crawl
    yield self.engine.open_spider(self.spider, start_requests)
ImportError: No module named '_sqlite3'

The Python version I'm using:

[root]# cd ~
[root]# python -V
Python 3.5.2
[root]# pip -V
pip 9.0.1 from /usr/local/lib/python3.5/site-packages (python 3.5)
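
The failure can be reproduced without Scrapy at all. As a minimal sketch, this check asks whether the interpreter was built with the `_sqlite3` C extension that the standard-library `sqlite3` module wraps:

```python
import importlib.util

# The stdlib 'sqlite3' module is a wrapper around the C extension
# '_sqlite3'; if the interpreter was compiled without SQLite headers
# present, find_spec returns None and any import of sqlite3 fails.
spec = importlib.util.find_spec("_sqlite3")
print("sqlite3 available" if spec is not None else "sqlite3 MISSING")
```

On the broken interpreter above this would print `sqlite3 MISSING`, matching the `ImportError: No module named '_sqlite3'` in the traceback.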

Thanks. Any help is appreciated.

1 Answer:

Answer 0 (score: 0)

This doesn't look like a Scrapy problem to me. The `ImportError: No module named '_sqlite3'` means your Python was compiled without SQLite support. You probably need to explicitly install libsqlite3-dev (assuming you're on a Debian-based system) or sqlite-devel (if you're on the Red Hat family) and then rebuild Python so the `_sqlite3` extension gets compiled.
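
On a Red Hat-family box like the one in the question, the fix might look roughly like this. This is a sketch, not a definitive recipe: the source directory `/usr/local/src/Python-3.5.2` is an assumption based on a typical source install, so adjust it to wherever Python 3.5.2 was originally built:

```shell
# Install the SQLite development headers (Red Hat family;
# on Debian/Ubuntu use: apt-get install libsqlite3-dev).
yum install -y sqlite-devel

# Rebuild Python so the _sqlite3 extension is compiled this time.
# Path below is an assumed location of the original source tree.
cd /usr/local/src/Python-3.5.2
./configure
make
make install
```

After reinstalling, `python -c "import sqlite3"` should succeed, and the Scrapy spider should start.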