Memory usage of a multiprocessing Python script

Time: 2018-08-07 21:08:06

Tags: python multithreading memory

I want to measure the total peak memory usage of a Python script that does its work in multiple processes using ProcessPoolExecutor from the concurrent.futures module. How can I do this correctly? Right now my test script reports the wrong value: the memory usage it prints reflects only the main process, so there is almost no difference between the single-process and multiprocess variants of the script.

import requests
import psutil
import concurrent.futures
from concurrent.futures import ProcessPoolExecutor


def load_url(url):
    r = requests.get(url)
    return r.text


if __name__ == '__main__':

    url = "https://www.httpbin.org/delay/0.1"
    URLS = [url for x in range(500)]

    with ProcessPoolExecutor(max_workers=32) as executor:
        futures = {executor.submit(load_url, url): url for url in URLS}

        for future in concurrent.futures.as_completed(futures):
            url = futures[future]

            try:
                data = future.result()
            except Exception as e:
                print(f"{url} generated an exception: {e}")
            else:
                pass

    process = psutil.Process()
    print(process.memory_info().rss)
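The printed value only covers the main process because psutil.Process() with no argument inspects the calling process; the pool workers are separate processes with their own address spaces. One way to get a more complete number (a sketch, assuming psutil is installed and that the measurement is taken while the workers are still alive, i.e. inside the `with` block) is to sum the RSS of the main process and all of its descendants:

```python
import os

import psutil  # third-party: pip install psutil


def total_rss() -> int:
    """Return the combined resident set size (bytes) of the current
    process and all of its live descendants, e.g. pool workers."""
    parent = psutil.Process(os.getpid())
    total = parent.memory_info().rss
    for child in parent.children(recursive=True):
        try:
            total += child.memory_info().rss
        except psutil.NoSuchProcess:
            # A worker may have exited between listing and inspection.
            pass
    return total


if __name__ == "__main__":
    print(total_rss())
```

Note that this must be called before the executor shuts down (the workers are terminated when the `with` block exits), and that summing RSS double-counts pages shared between the processes, so it overestimates the true footprint. As an alternative on Unix, after the pool has shut down, resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss from the standard library reports the peak RSS among the reaped child processes (in kilobytes on Linux, bytes on macOS), though it gives the maximum of any one child, not a total.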

0 Answers:

No answers