ScrapyJS (Scrapy + Splash) fails to load the script, but the standalone Splash server works fine

Date: 2017-05-11 14:40:58

Tags: python scrapy web-crawler splash scrapy-splash

I am trying to use Scrapy with Splash (scrapy-splash) to crawl pages that run scripts, so that I get the fully loaded page. I render the page with Splash + Scrapy using the code below. The args are exactly the same as the ones I use against the localhost:8050 server directly.

   script = """
    function main(splash)
      local url = splash.args.url
      assert(splash:go(url))
      assert(splash:wait(0.5))
      return {
        html = splash:html(),
        png = splash:png(),
        har = splash:har(),
      }
    end
    """

    splash_args = {
        'wait': 0.5,
        'url': response.url,
        'images': 1,
        'expand': 1,
        'timeout': 60.0,
        'lua_source': script
    }

    yield SplashRequest(response.url,
                        self.parse_list_other_page,
                        cookies=response.request.cookies,
                        args=splash_args)

The response html does not contain the elements I need, but if I use the Splash server directly at localhost:8050, it works fine.

Do you know where the problem is?

This is my settings.py
    SPLASH_URL = 'http://127.0.0.1:8050'
    SPIDER_MIDDLEWARES = {
        'scrapy_splash.SplashDeduplicateArgsMiddleware': 100,
    }

    # Enable or disable downloader middlewares
    # See http://scrapy.readthedocs.org/en/latest/topics/downloader-middleware.html
    DOWNLOADER_MIDDLEWARES = {
        'scrapy_splash.SplashCookiesMiddleware': 723,
        'scrapy_splash.SplashMiddleware': 725,
        # 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 750,
        'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware': 810,
    }

    # Crawl responsibly by identifying yourself (and your website) on the user-agent
    USER_AGENT = ("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) "
                  "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.111 "
                  "Safari/537.36")


1 Answer:

Answer 0 (score: 1):

The default endpoint is 'render.json'; to use the 'lua_source' argument (i.e. to run a Lua script), you have to use the 'execute' endpoint:

yield SplashRequest(response.url,
                    self.parse_list_other_page,
                    endpoint='execute',
                    cookies=response.request.cookies,
                    args=splash_args)