Context
In an automated test case I download a folder, and I hash each of the files it contains in the following way:
import hashlib
import os
import sys

def hash_directory(path):
    hashed_files = {}
    if os.path.isfile(path):
        # Single file: hash it in 64 KB chunks and return a one-entry dict.
        md5 = hashlib.md5()
        with open(path, 'rb') as f:
            while True:
                buf = f.read(65536)
                if not buf:
                    break
                md5.update(buf)
        return {os.path.basename(path): md5.hexdigest()}
    else:
        # Directory: walk it and hash every file, keyed by its path relative to the top folder.
        for dir_name, dir_names, file_names in os.walk(path):
            for filename in file_names:
                md5 = hashlib.md5()
                file_path = os.path.join(dir_name.replace(path, '').strip('\\'), filename)
                with open(os.path.join(dir_name, filename), 'rb') as f:
                    part = 0
                    while True:
                        buf = f.read(65536)
                        if not buf:
                            break
                        md5.update(buf)
                        # If the md5 object gets larger than a quarter MB, digest a part of the file.
                        if sys.getsizeof(md5) > 262144:
                            hashed_files[file_path + 'part' + str(part)] = md5.hexdigest()
                            part += 1
                hashed_files[file_path] = md5.hexdigest()
        return hashed_files
I do the same thing to a copy of the folder kept on a server on my network (this is the "expected" result), and compare the two hashed_files dictionaries to determine whether the downloaded folder is missing files or whether any file is corrupted.
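For reference, the comparison step is essentially a dictionary diff along these lines (a minimal sketch; compare_hashes and the example paths are placeholder names, not the real test code):

def compare_hashes(expected, actual):
    # Files present in the expected dict but absent from the download.
    missing = set(expected) - set(actual)
    # Files present in both but whose MD5 digests differ.
    corrupted = {f for f in set(expected) & set(actual)
                 if expected[f] != actual[f]}
    return missing, corrupted

# expected = hash_directory(r'\\server\share\folder')   # placeholder path
# actual = hash_directory(r'C:\temp\downloaded_folder')  # placeholder path
# missing, corrupted = compare_hashes(expected, actual)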
Problem
I get the following:
Traceback (most recent call last):
File "c:\python27\lib\runpy.py", line 162, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "c:\python27\lib\runpy.py", line 72, in _run_code
exec code in run_globals
File "c:\python27\lib\site-packages\robot\run.py", line 550, in <module>
run_cli(sys.argv[1:])
File "c:\python27\lib\site-packages\robot\run.py", line 489, in run_cli
return RobotFramework().execute_cli(arguments, exit=exit)
File "c:\python27\lib\site-packages\robot\utils\application.py", line 46, in execute_cli
rc = self._execute(arguments, options)
File "c:\python27\lib\site-packages\robot\utils\application.py", line 90, in _execute
error, details = get_error_details(exclude_robot_traces=False)
File "c:\python27\lib\site-packages\robot\utils\error.py", line 47, in get_error_details
details = ErrorDetails(exclude_robot_traces=exclude_robot_traces)
File "c:\python27\lib\site-packages\robot\utils\error.py", line 60, in ErrorDetails
raise exc_value
MemoryError
We write our test cases with Robot Framework on Python 2.7, and this happens both on my 16 GB Windows 10 machine and on the remote server that runs the code.
Does anyone have a hunch what might be causing this? Also, does anyone know of a Python tool for monitoring memory while the code is running?