When I convert my Scrapy project into a Windows executable with PyInstaller, running the resulting .exe fails with the Scrapy error shown below.
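My entry script looks roughly like the sketch below; the project and spider names here are placeholders rather than the real ones, but the structure is the same (a CrawlerProcess started from a main.py that PyInstaller freezes):

# main.py - the script handed to PyInstaller (module/spider names are placeholders)
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# placeholder import; the real project has its own spider module
from myproject.spiders.example import ExampleSpider

if __name__ == '__main__':
    # load the project settings and run the spider in-process,
    # since the scrapy command line is not available inside the frozen exe
    process = CrawlerProcess(get_project_settings())
    process.crawl(ExampleSpider)
    process.start()

I build it with something along the lines of pyinstaller --onefile main.py, and this is the output when I run the resulting executable: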
2019-03-04 13:01:00 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.logstats.LogStats']
2019-03-04 13:01:01 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
'scrapy.downloadermiddlewares.retry.RetryMiddleware',
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-03-04 13:01:01 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-03-04 13:01:01 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2019-03-04 13:01:01 [scrapy.core.engine] INFO: Spider opened
Unhandled error in Deferred:
2019-03-04 13:01:01 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
File "site-packages\scrapy\crawler.py", line 172, in crawl
File "site-packages\scrapy\crawler.py", line 176, in _crawl
File "site-packages\twisted\internet\defer.py", line 1613, in unwindGenerator
File "site-packages\twisted\internet\defer.py", line 1529, in _cancellableInlineCallbacks
--- <exception caught here> ---
File "site-packages\twisted\internet\defer.py", line 1418, in _inlineCallbacks
File "site-packages\scrapy\crawler.py", line 82, in crawl
builtins.ModuleNotFoundError: No module named '_sqlite3'
Below is the detailed version information on my system:
scrapy version -v
Scrapy : 1.5.0
lxml : 4.3.2.0
libxml2 : 2.9.5
cssselect : 1.0.3
parsel : 1.5.1
w3lib : 1.20.0
Twisted : 18.9.0
Python : 3.7.0 (v3.7.0:1bf9cc5093, Jun 27 2018, 04:59:51) [MSC v.1914 64 bit (AMD64)]
pyOpenSSL : 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019)
cryptography : 2.6.1
Platform : Windows-10-10.0.17134-SP0
sqlite3 also imports fine from the regular (non-frozen) Python interpreter on the same machine.
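For example, this runs without any error in the system Python (the version print is just my own sanity check, not part of the frozen build):

import sqlite3
print(sqlite3.sqlite_version)  # prints the SQLite library version, no ModuleNotFoundError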
Has anyone run into a similar problem with PyInstaller on Windows?
Thanks in advance.