How do I close the JSON file after the crawl so I can zip it?

Time: 2018-12-04 05:21:49

Tags: python scrapy

I am crawling with Scrapy 1.5.1 via the CLI:

scrapy crawl test -o data/20181204_test.json -t json 

My pipeline is very simple: I process the items in it, and after processing I want to put the output files into a zip archive inside the close_spider method:

class BidPipeline(object):
    def process_item(self, item, spider):
        return item
    def close_spider(self, spider):
        # trying to close the writing of the file
        self.exporter.finish_exporting()
        self.file.close()
        # zip the img and json files into an archive
        cleanup('test')

The cleanup method:

import datetime
import os
import shutil
import zipfile

def cleanup(name):
    # create zip archive with all images inside
    filename = '../zip/' + datetime.datetime.now().strftime("%Y%m%d-%H%M") + '_' + name
    imagefolder = 'full'
    imagepath = '/Users/user/test_crawl/bid/images'
    shutil.make_archive(
        filename,
        'zip',
        imagepath,
        imagefolder
    )
    # delete images
    shutil.rmtree(imagepath + '/' + imagefolder)

    # add the json feed file to the zip archive
    filename_zip = filename + '.zip'
    archive = zipfile.ZipFile(filename_zip, 'a')
    path_to_file = '/Users/user/test_crawl/bid/data/' + datetime.datetime.now().strftime("%Y%m%d") + '_' + name + '.json'
    archive.write(path_to_file, os.path.basename(path_to_file))
    archive.close()

Traceback after adding self.file.close():

AttributeError: 'BidPipeline' object has no attribute 'exporter'
2018-12-04 06:03:48 [scrapy.extensions.feedexport] INFO: Stored json feed (173 items) in: data/20181204_test.json

Without file.close there is no traceback, and at first it looks fine, but the json file is truncated.

The end of the file extracted from the zip archive:

..a46.jpg"]},

The end of the json file output by scrapy:

a46.jpg"]}]

How do I close the writing of the file so that I can zip it?

1 answer:

Answer 0 (score: 1):

Try removing the line self.exporter.finish_exporting().

Your object has no exporter attribute: the pipeline never created one, so there is nothing to finish. The -o feed file is written by Scrapy's feed export extension, not by your pipeline, which is why self.file.close() fails too. Note the order in your log: the AttributeError is raised first, and only afterwards does scrapy.extensions.feedexport report "Stored json feed" — the feed file is stored after the pipeline's close_spider runs, so any zipping done there sees a truncated file.
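One way around the ordering problem is to have the pipeline write the JSON feed itself instead of using -o, so the file is guaranteed to be closed before the archiving step runs. Below is a minimal, standard-library sketch (not Scrapy's own exporter API); the feed_path default and the commented-out cleanup call mirror the paths from the question and are placeholders for your setup:

```python
import json

class JsonWritingPipeline:
    """Writes the feed from the pipeline itself, so close_spider
    can safely zip the finished file afterwards."""

    def __init__(self, feed_path='data/20181204_test.json'):
        # placeholder path; in a real project this could come
        # from a setting via from_crawler
        self.feed_path = feed_path

    def open_spider(self, spider):
        self.file = open(self.feed_path, 'w')
        self.items = []

    def process_item(self, item, spider):
        # collect the items; they are dumped in one go on close
        self.items.append(dict(item))
        return item

    def close_spider(self, spider):
        json.dump(self.items, self.file)
        self.file.close()      # the feed file is complete here
        # cleanup('test')      # now safe to zip the json file
```

With this approach you would drop -o from the command line and enable the pipeline in ITEM_PIPELINES instead; buffering all items in memory is fine for a few hundred items, but for very large crawls an incremental writer would be preferable.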