I'm running into memory problems when searching a folder that contains millions of files. Does anyone know how to get around this? Is there a way to limit the files glob searches, so the search can be processed in chunks?
Traceback (most recent call last):
  File "./lb2_lmanager", line 533, in <module>
    main(sys.argv[1:])
  File "./lb2_lmanager", line 318, in main
    matched = match_files(policy.directory, policy.file_patterns)
  File "./lb2_lmanager", line 32, in wrapper
    res = func(*args, **kwargs)
  File "./lb2_lmanager", line 380, in match_files
    listing = glob.glob(directory)
  File "/usr/lib/python2.6/glob.py", line 16, in glob
    return list(iglob(pathname))
  File "/usr/lib/python2.6/glob.py", line 43, in iglob
    yield os.path.join(dirname, name)
  File "/usr/lib/python2.6/posixpath.py", line 70, in join
    path += '/' + b
MemoryError
Answer (score: 1)
Try using generators instead of lists. glob.glob builds the entire list of matches in memory before returning, which is what triggers the MemoryError; glob.iglob returns an iterator that yields one path at a time. To learn more about generators, read this.
import glob

# iglob yields matching paths one at a time instead of
# materializing the full list, so memory use stays flat
dir_list = glob.iglob(YOUR_DIRECTORY)
for filename in dir_list:
    print filename
Change YOUR_DIRECTORY to the directory you want to list.
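If you also want to process the matches in fixed-size chunks, as the question asks, you can combine glob.iglob with itertools.islice. Here is a minimal sketch; the pattern '/some/dir/*' and the process_chunk handler are hypothetical placeholders, not part of the original question:

import glob
import itertools

def chunked_glob(pattern, chunk_size):
    # Pull at most chunk_size paths from the iterator at a time,
    # so memory use is bounded by the chunk size rather than the
    # total number of matches.
    matches = glob.iglob(pattern)
    while True:
        chunk = list(itertools.islice(matches, chunk_size))
        if not chunk:
            break
        yield chunk

for chunk in chunked_glob('/some/dir/*', 1000):  # hypothetical pattern
    process_chunk(chunk)  # hypothetical per-chunk handler

This keeps at most one chunk of paths in memory at any moment, which should avoid the MemoryError even with millions of files.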