I am using the django-dynamic-scraper library to build a dynamic scraper. The goal is to scrape websites and save the data to my database on the fly, and from there render it in my Django app, where the context is rendered automatically. But when I run the spider from the command line, this is the output:

2016-05-08 12:56:06 [django.db.backends] DEBUG: (0.000) QUERY = u'SELECT "dynamic_scraper_scrapedobjclass"."id", "dynamic_scraper_scrapedobjclass"."name", "dynamic_scraper_scrapedobjclass"."scraper_scheduler_conf", "dynamic_scraper_scrapedobjclass"."checker_scheduler_conf", "dynamic_scraper_scrapedobjclass"."comments" FROM "dynamic_scraper_scrapedobjclass" WHERE "dynamic_scraper_scrapedobjclass"."id" = %s' - PARAMS = (1,); args=(1,)
2016-05-08 12:56:06 [root] INFO: Spider for NewsWebsite "Wikinews" (1) initialized.
Unhandled error in Deferred:
2016-05-08 12:56:06 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\crawler.py", line 153, in crawl
    d = crawler.crawl(*args, **kwargs)
  File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\crawler.py", line 71, in crawl
    self.engine = self._create_engine()
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\crawler.py", line 83, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\engine.py", line 66, in __init__
    self.downloader = downloader_cls(crawler)
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\__init__.py", line 65, in __init__
    self.handlers = DownloadHandlers(crawler)
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 23, in __init__
    cls = load_object(clspath)
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\utils\misc.py", line 44, in load_object
    mod = import_module(module)
  File "C:\Python27\Lib\importlib\__init__.py", line 37, in import_module
    __import__(name)
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\handlers\s3.py", line 6, in <module>
    from .http import HTTPDownloadHandler
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\handlers\http.py", line 5, in <module>
    from .http11 import HTTP11DownloadHandler as HTTPDownloadHandler
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 15, in <module>
    from scrapy.xlib.tx import Agent, ProxyAgent, ResponseDone, \
  File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\xlib\tx\__init__.py", line 3, in <module>
    from twisted.web import client
  File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\web\client.py", line 41, in <module>
    from twisted.internet.endpoints import TCP4ClientEndpoint, SSL4ClientEndpoint
  File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\endpoints.py", line 34, in <module>
    from twisted.internet.stdio import StandardIO, PipeAddress
  File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\stdio.py", line 30, in <module>
    from twisted.internet import _win32stdio
  File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\_win32stdio.py", line 7, in <module>
    import win32api
exceptions.ImportError: No module named win32api
2016-05-08 12:56:06 [twisted] CRITICAL:
That is the error I receive. I am using this for the complete code reference. Please advise.
Answer (score: 1)
This happened to me too when I installed pywin32 from SourceForge. Download the package from this link instead. Also make sure your pip is upgraded to the latest version; you can upgrade it with "easy_install --upgrade pip" or "pip install --upgrade pip". Then reinstall pywin32 from the given location. Hopefully that works for you. If you still have problems, let me know.
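As a quick way to confirm whether pywin32 is actually visible to the interpreter that runs Scrapy, a check like the following can help. This is only a diagnostic sketch, not part of django-dynamic-scraper or Scrapy; the function name is made up here, and `pypiwin32` is assumed to be the pip-installable distribution of pywin32:

```python
def pywin32_available():
    """Return True if the 'win32api' module (provided by pywin32) imports cleanly."""
    try:
        import win32api  # noqa: F401  # only checking importability
        return True
    except ImportError:
        return False


if __name__ == "__main__":
    if pywin32_available():
        print("win32api found: Twisted's Windows stdio support should load")
    else:
        print("win32api missing: upgrade pip, then install pywin32 (e.g. 'pip install pypiwin32')")
```

Run this with the same Python that runs your crawl command; if it reports the module as missing there, the Scrapy spider will fail with the same ImportError.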