scrapy throws a send_catch_log_deferred exception on PyPy

Date: 2012-11-09 09:27:53

Tags: scrapy pypy

When I run scrapy shell on PyPy, it throws an exception. What does this error mean? Here is the error output:

    % /usr/local/share/pypy/scrapy shell http://www.baidu.com             
    zsh: correct 'shell' to 'shells' [nyae]? n
    2012-11-09 16:40:06+0800 [scrapy] INFO: Scrapy 0.16.1 started (bot: scrapybot)
    2012-11-09 16:40:06+0800 [scrapy] DEBUG: Enabled extensions: TelnetConsole, WebService, CloseSpider, CoreStats, SpiderState
    2012-11-09 16:40:06+0800 [scrapy] DEBUG: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, RedirectMiddleware, CookiesMiddleware, HttpCompressionMiddleware, ChunkedTransferMiddleware, DownloaderStats
    2012-11-09 16:40:06+0800 [scrapy] DEBUG: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
    2012-11-09 16:40:06+0800 [scrapy] DEBUG: Enabled item pipelines: 
    2012-11-09 16:40:06+0800 [scrapy] ERROR: Error caught on signal handler: <bound method instance.start_listening of <scrapy.telnet.TelnetConsole instance at 0x00000001063f0bc0>>
        Traceback (most recent call last):
          File "/usr/local/Cellar/pypy/1.9/site-packages/twisted/internet/defer.py", line 1045, in _inlineCallbacks
            result = g.send(result)
          File "/usr/local/Cellar/pypy/1.9/site-packages/scrapy/core/engine.py", line 75, in start
            yield self.signals.send_catch_log_deferred(signal=signals.engine_started)
          File "/usr/local/Cellar/pypy/1.9/site-packages/scrapy/signalmanager.py", line 23, in send_catch_log_deferred
            return signal.send_catch_log_deferred(*a, **kw)
          File "/usr/local/Cellar/pypy/1.9/site-packages/scrapy/utils/signal.py", line 53, in send_catch_log_deferred
            *arguments, **named)
        --- <exception caught here> ---
          File "/usr/local/Cellar/pypy/1.9/site-packages/twisted/internet/defer.py", line 134, in maybeDeferred
            result = f(*args, **kw)
          File "/usr/local/Cellar/pypy/1.9/site-packages/scrapy/xlib/pydispatch/robustapply.py", line 47, in robustApply
            return receiver(*arguments, **named)
        exceptions.TypeError: start_listening() got 2 unexpected keyword arguments

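The failure happens inside pydispatch's robustApply, which, as far as I understand, is meant to forward only the keyword arguments that the receiver's signature actually declares. The sketch below is my rough approximation of that filtering (the names `robust_apply_sketch` and `TelnetConsoleLike` are made up for illustration, not the real implementation); the TypeError above suggests this filtering does not kick in for the bound method under PyPy 1.9:

    import inspect

    def robust_apply_sketch(receiver, *args, **named):
        # Rough approximation of pydispatch-style dispatch: keyword
        # arguments the receiver does not declare are dropped before
        # the receiver is called.
        spec = inspect.getargspec(receiver)   # Python 2-era inspect API
        if spec.keywords is None:             # receiver has no **kwargs
            allowed = set(spec.args)
            named = dict((k, v) for k, v in named.items() if k in allowed)
        return receiver(*args, **named)

    class TelnetConsoleLike:
        # Stand-in for a signal receiver such as TelnetConsole.start_listening,
        # which accepts no extra keyword arguments.
        def start_listening(self):
            return "listening"

    console = TelnetConsoleLike()
    # send_catch_log_deferred passes extra kwargs (signal, sender, ...);
    # the filtering above is what normally keeps them from reaching the
    # receiver and triggering the TypeError seen in the traceback.
    print(robust_apply_sketch(console.start_listening,
                              signal="engine_started", sender="engine"))
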
1 Answer:

Answer 0 (score: 1):

As far as I know, scrapy uses lxml. Only a very recent lxml works (and only with PyPy trunk, not PyPy 1.9); I suggest giving that a try.
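
If you go down that route, a quick sanity check (my suggestion, not part of the original answer) is to confirm which interpreter and which lxml build are actually being picked up before retrying the scrapy shell:

    # Run this with the same PyPy binary that launches scrapy.
    import sys
    import lxml.etree

    print(sys.version)               # should identify PyPy, not CPython
    print(lxml.etree.__version__)    # the installed lxml release
    print(lxml.etree.LIBXML_VERSION) # the libxml2 it was compiled against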