I'm not sure if this is even possible...

I have a large Python application that grows to a very large memory footprint. I'm hoping to track the process growth per import statement so I can minimize it as much as possible.
The closest thing I've found is memory_profiler's line-profiling feature, but that only profiles the "top-level" import statements; I'd like a breakdown of all the subordinate imports as well. I haven't found any profiler that can track the memory cost of import statements.
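For context, the memory_profiler usage I mean looks roughly like this (a minimal sketch on my part; big_module and another_module are just placeholders). Each import line shows up as a single memory increment in the report, with no breakdown of the sub-imports it triggers:

from memory_profiler import profile

@profile
def load_dependencies():
    # each line below is reported as one memory increment...
    import big_module
    # ...but the sub-imports it pulls in are not broken out
    import another_module

if __name__ == "__main__":
    load_dependencies()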
This isn't just about optimizing our own code. A recent audit showed that some PyPI modules support third-party frameworks simply by dropping an import statement into a try/except block:
try:
    import bottle
    # declare bottle support here
except:
    pass
While my application is deployed in a virtualenv, several sibling services belong to the same deployment and run in the same virtualenv... one of them uses bottle.

This "pattern" is used in a handful of the libraries I depend on, and the overhead of the unwanted/unneeded modules is a fair amount of the application's memory footprint (based on manually isolating and measuring them). I'm trying to figure out which libraries to prioritize for patching, and which ones I can safely ignore.
Answer 0 (score: 1)
After not having much luck, I had a wacky idea and it somewhat works.
I overrode the import
statement to calculate the current memory of the process before and after every import. I don't think this covers every import situation, but it's a good start. I simply printed the results, copy/pasted them into a file, and did some quick preprocessing to turn them into a CSV that tracks the index and the percent of growth/total for each call. That's enough for my current needs.
import os
import psutil
import __builtin__

# handle on the current process so we can sample its memory usage
this_process = psutil.Process(os.getpid())

# keep a reference to the real import machinery before replacing it
realimport = __builtin__.__import__

def myimp(name, *args, **kwargs):
    # sample resident memory before and after the real import runs
    _mem_start = this_process.get_memory_info()[0]
    r = realimport(name, *args, **kwargs)
    _mem_finish = this_process.get_memory_info()[0]
    _mem_growth = _mem_finish - _mem_start
    # log one line per import: name, growth, before, after (all in bytes)
    print "import|%s,%s,%s,%s" % (name, _mem_growth, _mem_start, _mem_finish)
    return r

# install the wrapper so every subsequent import gets measured
__builtin__.__import__ = myimp
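For what it's worth, the "quick preprocessing" was roughly the following (a rough sketch rather than my exact script; it assumes the printed lines were saved to a file named imports.log, which is just a placeholder name):

# parse the "import|name,growth,start,finish" lines and emit an indexed CSV
rows = []
with open("imports.log") as f:
    for line in f:
        if not line.startswith("import|"):
            continue
        name, growth, start, finish = line[len("import|"):].strip().split(",")
        rows.append((name, int(growth), int(start), int(finish)))

total_growth = sum(r[1] for r in rows) or 1  # guard against dividing by zero

with open("imports.csv", "w") as out:
    out.write("index,name,growth_bytes,pct_of_total\n")
    for i, (name, growth, start, finish) in enumerate(rows):
        out.write("%s,%s,%s,%.2f\n" % (i, name, growth, 100.0 * growth / total_growth))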
There are better ways to do the above, and I still hope there are better ways to profile an app like this. For now, I've got a working solution.
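As a side note, if anyone wants to try this on Python 3, the same hook should hang off builtins instead of __builtin__, and newer psutil spells the call memory_info(); a sketch of that variant, assuming nothing else needs to change:

import builtins
import psutil

this_process = psutil.Process()  # defaults to the current pid
realimport = builtins.__import__

def myimp(name, *args, **kwargs):
    # sample resident set size before and after the real import
    mem_start = this_process.memory_info().rss
    module = realimport(name, *args, **kwargs)
    mem_finish = this_process.memory_info().rss
    print("import|%s,%s,%s,%s" % (name, mem_finish - mem_start, mem_start, mem_finish))
    return module

builtins.__import__ = myimp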