Output one CSV file per crawled start URL

Time: 2019-11-10 12:37:37

Tags: python scrapy

I want to output one CSV file per start_url. I made a pipeline that outputs a single file containing the data from all the URLs, but I don't know how to output multiple files.

pipeline.py

import csv

from scrapy import signals
from scrapy.exporters import CsvItemExporter


class CSVPipeline(object):

    def __init__(self):
        self.files = {}

    @classmethod
    def from_crawler(cls, crawler):
        pipeline = cls()
        crawler.signals.connect(pipeline.spider_opened, signals.spider_opened)
        crawler.signals.connect(pipeline.spider_closed, signals.spider_closed)
        return pipeline

    def spider_opened(self, spider):
        file = open('%s_items.csv' % spider.name, 'w+b')
        self.files[spider] = file
        self.exporter = CsvItemExporter(file)
        self.exporter.fields_to_export = ['date', 'move', 'bank', 'call', 'price']
        self.exporter.start_exporting()

    def spider_closed(self, spider):
        self.exporter.finish_exporting()
        file = self.files.pop(spider)
        file.close()

        # Remove any blank rows from the exported CSV
        print('Starting csv blank line cleaning')
        with open('%s_items.csv' % spider.name, 'r') as f:
            reader = csv.reader(f)
            original_list = list(reader)
            cleaned_list = list(filter(None, original_list))

        with open('%s_items_cleaned.csv' % spider.name, 'w', newline='') as output_file:
            wr = csv.writer(output_file, dialect='excel')
            for data in cleaned_list:
                wr.writerow(data)

    def process_item(self, item, spider):
        self.exporter.export_item(item)
        return item


class SentimentPipeline(object):
    def process_item(self, item, spider):
        return item

I have been running:

scrapy crawl spider -o spider.csv

Do I need a new command? I'm very new to Scrapy. Thanks!

1 Answer:

Answer 0 (score: 0)

You need to create a CSV item pipeline like the following in your pipelines.py file:

import re

from scrapy.exporters import CsvItemExporter


class PerUrlCsvExportPipeline:

    def open_spider(self, spider):
        self.url_to_exporter = {}
        self.url_to_file = {}

    def close_spider(self, spider):
        for exporter in self.url_to_exporter.values():
            exporter.finish_exporting()
        for file in self.url_to_file.values():
            file.close()

    def _exporter_for_item(self, item):
        # Each item is expected to carry the URL it was scraped from
        url = item['url']
        if url not in self.url_to_exporter:
            # Turn the URL into a filesystem-safe file name
            file_name = re.sub(r'[^A-Za-z0-9]+', '_', url).strip('_')
            file = open('{}.csv'.format(file_name), 'wb')
            self.url_to_file[url] = file
            exporter = CsvItemExporter(file)
            exporter.start_exporting()
            self.url_to_exporter[url] = exporter
        return self.url_to_exporter[url]

    def process_item(self, item, spider):
        exporter = self._exporter_for_item(item)
        exporter.export_item(item)
        return item
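
For this to work, each item must include the URL it came from. Below is a minimal sketch of a spider that attaches the originating URL to every item; the spider name, start URLs, and CSS selector are hypothetical placeholders for your own.

import scrapy


class ExampleSpider(scrapy.Spider):
    name = 'example'
    # Hypothetical start URLs; replace with your own
    start_urls = [
        'http://example.com/page1',
        'http://example.com/page2',
    ]

    def parse(self, response):
        # Tag each item with the URL it was scraped from so the
        # pipeline can route it to the matching CSV file
        yield {
            'url': response.url,
            'price': response.css('.price::text').get(),
        }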

Then add the pipeline to your settings.py file:

    ITEM_PIPELINES = {
        'your_project_name.pipelines.PerUrlCsvExportPipeline': 300,
    }
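
With the pipeline enabled, you should no longer need the -o option, since the pipeline writes the per-URL CSV files itself; running the crawl on its own should be enough:

scrapy crawl spider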