Python: unpredictable memory error when downloading large files

Asked: 2011-03-30 21:40:41

Tags: python download out-of-memory

I wrote a Python script that downloads a large number of video files (50-400 MB each) from an HTTP server. It has worked well so far on long download lists, but for some reason it occasionally hits a MemoryError.

The machine has about 1 GB of RAM free, but I don't think it ever maxes out the RAM while running this script.

I've watched the memory usage in Task Manager and perfmon, and it always behaves the same way: a slow climb during the download, then a return to the normal level once the download completes (no small leaks creeping up or anything like that).

The way the download behaves is that the file is created and stays at 0 KB until the download finishes (or the program crashes), then the entire file is written at once and the file is closed.

import os
import time
import urllib2

# urls, filenames, headers, domain, and folderName are defined earlier in the script

for i in range(len(urls)):
    if os.path.exists(folderName + '/' + filenames[i] + '.mov'):
        print 'File exists, continuing.'
        continue

    # Request the download page
    req = urllib2.Request(urls[i], headers = headers)

    sock = urllib2.urlopen(req)
    responseHeaders = sock.headers
    body = sock.read()
    sock.close()

    # Search the page for the download URL
    tmp = body.find('/getfile/')
    downloadSuffix = body[tmp:body.find('"', tmp)]
    downloadUrl = domain + downloadSuffix

    req = urllib2.Request(downloadUrl, headers = headers)

    print '%s Downloading %s, file %i of %i' \
        % (time.ctime(), filenames[i], i+1, len(urls))

    f = urllib2.urlopen(req)

    # Open our local file for writing, 'b' for binary file mode
    video_file = open(folderName + '/' + filenames[i] + '.mov', 'wb')

    # Write the downloaded data to the local file
    video_file.write(f.read()) ##### MemoryError: out of memory #####
    video_file.close()

    print '%s Download complete!' % (time.ctime())

    # Free up memory, in hopes of preventing memory errors
    del f
    del video_file

Here is the stack trace:

  File "downloadVideos.py", line 159, in <module>
    main()
  File "downloadVideos.py", line 136, in main
    video_file.write(f.read())
  File "c:\python27\lib\socket.py", line 358, in read
    buf.write(data)
MemoryError: out of memory

1 Answer:

Answer 0 (score: 9)

Your problem is right here: f.read(). That call tries to read the entire download into memory at once. Instead, read it in chunks (chunk = f.read(4096)) and save the pieces out to a file as you go.
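
A minimal sketch of that chunked loop, reusing the variable names from the question (the 4096-byte chunk size is illustrative, and for brevity it writes straight to the destination file rather than a temporary one):

req = urllib2.Request(downloadUrl, headers = headers)
f = urllib2.urlopen(req)

video_file = open(folderName + '/' + filenames[i] + '.mov', 'wb')
while True:
    chunk = f.read(4096)     # pull at most 4096 bytes off the socket
    if not chunk:            # an empty string means the response is exhausted
        break
    video_file.write(chunk)  # only one chunk is ever held in memory
video_file.close()
f.close()

In Python 2.7 the standard library can also do the loop for you: shutil.copyfileobj(f, video_file, 4096) copies between file-like objects in fixed-size chunks.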