Scrapy output in JSON from a script

Asked: 2014-05-09 22:02:11

Tags: python json web-scraping scrapy scrapy-spider

I am running Scrapy from a Python script:

from twisted.internet import reactor
from scrapy.crawler import Crawler
from scrapy import signals
from scrapy.utils.project import get_project_settings
from scrapy.xlib.pydispatch import dispatcher

def setup_crawler(domain):
    # stop the reactor once the spider finishes
    dispatcher.connect(stop_reactor, signal=signals.spider_closed)
    spider = ArgosSpider(domain=domain)
    settings = get_project_settings()
    crawler = Crawler(settings)
    crawler.configure()
    crawler.crawl(spider)
    crawler.start()
    reactor.run()

It runs successfully and stops, but where are the results? I want the results in JSON format. How can I get something like:

result = responseInJSON

just like when we use the command:

scrapy crawl argos -o result.json -t json
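For reference, the `result.json` file produced by the `-o` flag is a plain JSON array of items, so it can be read back with the standard library. A minimal sketch (the file contents and item fields here are made up for illustration):

```python
import json

# stand-in for the feed file Scrapy would write; fields are hypothetical
with open('result.json', 'w') as f:
    f.write('[{"name": "widget", "price": "9.99"}]')

# load the exported items back into Python objects
with open('result.json') as f:
    items = json.load(f)

print(items[0]['name'])  # -> widget
```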

4 answers:

Answer 0 (score: 23):

You need to set the FEED_FORMAT and FEED_URI settings manually:

settings.overrides['FEED_FORMAT'] = 'json'
settings.overrides['FEED_URI'] = 'result.json'

If you want to get the results into a variable, you can define a Pipeline class that collects the items into a list, and use a spider_closed signal handler to see the results:

import json

from twisted.internet import reactor
from scrapy.crawler import Crawler
from scrapy import log, signals
from scrapy.utils.project import get_project_settings


results = []

class MyPipeline(object):
    def process_item(self, item, spider):
        # collect every scraped item into the module-level list
        results.append(dict(item))
        return item

def spider_closed(spider):
    print results

# set up spider    
spider = TestSpider(domain='mydomain.org')

# set up settings
settings = get_project_settings()
settings.overrides['ITEM_PIPELINES'] = {'__main__.MyPipeline': 1}

# set up crawler
crawler = Crawler(settings)
crawler.signals.connect(spider_closed, signal=signals.spider_closed)
crawler.configure()
crawler.crawl(spider)

# start crawling
crawler.start()
log.start()
reactor.run() 
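Building on the same idea, the spider_closed handler could dump the collected list straight to JSON instead of printing it. A minimal, Scrapy-independent sketch (the `results` contents here are made-up stand-ins for the items the pipeline would have gathered):

```python
import json

# stands in for the list filled by MyPipeline during the crawl
results = [{'name': 'widget', 'price': '9.99'}]

def spider_closed(spider):
    # write everything gathered by the pipeline as one JSON array
    with open('result.json', 'w') as f:
        json.dump(results, f, indent=2)

spider_closed(None)
```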

FYI, take a look at how Scrapy parses command-line arguments.

See also: Capturing stdout within the same process in Python

Answer 1 (score: 13):

I managed to make it work simply by adding FEED_FORMAT and FEED_URI to the CrawlerProcess constructor, using the basic Scrapy API tutorial code, as follows:

from scrapy.crawler import CrawlerProcess

process = CrawlerProcess({
    'USER_AGENT': 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)',
    'FEED_FORMAT': 'json',
    'FEED_URI': 'result.json'
})

Answer 2 (score: 4):

Easy!

from scrapy import cmdline

cmdline.execute("scrapy crawl argos -o result.json -t json".split())

Put that script in the same place as scrapy.cfg.

Answer 3 (score: 0):

settings.overrides 

no longer seems to work; it must have been deprecated. The correct way to pass these settings now is to modify the project settings with the set method:

from scrapy.utils.project import get_project_settings
settings = get_project_settings()
settings.set('FEED_FORMAT', 'json')
settings.set('FEED_URI', 'result.json')
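Note that on newer Scrapy releases (2.1+), the FEED_FORMAT/FEED_URI pair was in turn superseded by the combined FEEDS setting. A sketch of the equivalent call, assuming the same `settings` object as above (this is a config fragment, not a runnable script):

```python
# FEEDS maps each output URI to its per-feed options
settings.set('FEEDS', {
    'result.json': {'format': 'json'},
})
```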