Python multiprocessing Pool.apply_async with shared variables (Value)

Posted: 2015-04-03 10:24:05

Tags: python linux pool python-multiprocessing

For my college project I am trying to develop a Python-based traffic generator. I have created two CentOS machines on VMware, using one as my client and one as my server. I use IP aliasing to increase the number of clients and servers with only a single client machine and a single server machine. So far I have created 50 IP aliases on my client machine and 10 IP aliases on my server machine. I use the multiprocessing module to generate traffic from all 50 clients to all 10 servers concurrently. I have also placed several files of different sizes (1kb, 10kb, 50kb, 100kb, 500kb, 1mb) on my server (in the /var/www/html directory, since I am using an Apache server), and I use urllib2 to send requests to these files from my client machine. I use httplib together with urllib2 to first bind to one of the aliased source IPs and then send requests from that IP.

Here, to increase my number of TCP connections, I am trying to use multiprocessing.Pool.apply_async. But I get the error "RuntimeError: Synchronized objects should only be shared between processes through inheritance" when running my script. After some debugging I found that the error is caused by my use of multiprocessing.Value. But I want to share some variables between my processes, and I also want to increase the number of TCP connections. Which other module (besides multiprocessing.Value) can be used here to share common variables? Or is there some other solution to this problem?

'''
Traffic Generator Script:

 Here I have used IP aliasing to create multiple clients on a single VM machine.
 I have done the same on the server side to create multiple servers. I have around 50 clients and 10 servers.
'''
import multiprocessing
import urllib2
import random
import myurllist    #list of all destination urls for all 10 servers
import time
import socbindtry   #script that binds various virtual/aliased client ips to the script
m=multiprocessing.Manager()
response_time=m.list()    #some shared variables
error_count=multiprocessing.Value('i',0)
def send_request3():    #function to send requests from alias client ip 1
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler3)    #bind to alias client ip1
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
            response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        error_count.value=error_count.value+1
def send_request4():    #function to send requests from alias client ip 2
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler4)    #bind to alias client ip2
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
            response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        error_count.value=error_count.value+1
#50 such functions are defined here for 50 clients
def func():
    pool=multiprocessing.Pool(processes=750)
    for i in range(5):
        pool.apply_async(send_request3)
        pool.apply_async(send_request4)
        pool.apply_async(send_request5)
#append 50 functions here
    pool.close()
    pool.join()
    print"All work Done..!!"
    return
start=float(time.time())
func()
end=float(time.time())-start
print end
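The socbindtry module referenced above is not shown in the question. For context only, a handler such as BindableHTTPHandler3 can be built by forcing urllib2's HTTP connections to bind to a given aliased source IP. A rough sketch (the alias address is a placeholder, the BoundHTTPConnection name is mine, and this assumes Python 2.7's source_address support in httplib):

import httplib
import urllib2

class BoundHTTPConnection(httplib.HTTPConnection):
    #force the outgoing TCP connection to use a specific (placeholder) source IP
    def __init__(self, host, **kwargs):
        httplib.HTTPConnection.__init__(self, host, source_address=("192.168.1.101", 0), **kwargs)

class BindableHTTPHandler3(urllib2.HTTPHandler):
    def http_open(self, req):
        return self.do_open(BoundHTTPConnection, req)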

2 Answers:

Answer 0 (score: 2)

As the error message says, you can't pass a multiprocessing.Value via pickle. However, you can use a multiprocessing.Manager().Value:

import multiprocessing
import urllib2
import random
import myurllist    #list of all destination urls for all 10 servers
import time
import socbindtry   #script that binds various virtual/aliased client ips to the script

def send_request3(response_time, error_count, error_lock):    #function to send requests from alias client ip 1
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler3)    #bind to alias client ip1
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
            response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        # Manager().Value proxies have no get_lock(), so guard the
        # read-modify-write with a shared Manager lock instead
        with error_lock:
            error_count.value += 1

def send_request4(response_time, error_count, error_lock):    #function to send requests from alias client ip 2
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler4)    #bind to alias client ip2
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
            response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        with error_lock:
            error_count.value += 1

#50 such functions are defined here for 50 clients

def func(response_time, error_count, error_lock):
    pool=multiprocessing.Pool(processes=2*multiprocessing.cpu_count())
    args = (response_time, error_count, error_lock)
    for i in range(5):
        pool.apply_async(send_request3, args=args)
        pool.apply_async(send_request4, args=args)
#append 50 functions here
    pool.close()
    pool.join()
    print "All work Done..!!"
    return

if __name__ == "__main__":
    m=multiprocessing.Manager()
    response_time=m.list()    #some shared variables
    error_count=m.Value('i',0)
    error_lock=m.Lock()       #lock to protect error_count increments

    start=float(time.time())
    func(response_time, error_count, error_lock)
    end=float(time.time())-start
    print end
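As a small follow-up (not part of the original answer): once func returns, the parent process can read the shared Manager proxies directly, e.g. at the end of the __main__ block:

    print "errors:", error_count.value
    print "requests completed:", len(response_time)
    if len(response_time):
        print "average response time:", sum(response_time)/len(response_time)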

Some other notes:

  1. Using a Pool with 750 processes is not a good idea. Unless you're on a server with hundreds of CPU cores, that will overwhelm your machine. It would be faster, and put less strain on your machine, to use significantly fewer processes. Something more like 2 * multiprocessing.cpu_count().
  2. As a best practice, you should explicitly pass all the shared arguments your child processes need to them, rather than using global variables. This increases the chances that the code will also work on Windows.
  3. Your send_request* functions look almost identical. Why not make just one function and use a variable to decide which socbindtry.BindableHTTPHandler to use? You would avoid a ton of code duplication that way (see the sketch after this list).
  4. The way you're incrementing error_count is not process/thread-safe, and is susceptible to race conditions. You need to protect the increment with a lock (as done in the example code above).
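For point 3, a minimal sketch of what a single parameterized worker might look like. The list of handler classes is an assumption, since socbindtry is not shown in the question:

# One worker that takes the handler class to bind with as an argument,
# instead of 50 nearly identical send_request* functions.
def send_request(handler, response_time, error_count, error_lock):
    opener = urllib2.build_opener(handler)    #bind to the given alias client ip
    try:
        tstart = time.time()
        for i in range(len(myurllist.url)):
            x = random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:", x
            response_time.append(time.time() - tstart)
    except urllib2.URLError, e:
        with error_lock:
            error_count.value += 1

# Inside func(): submit one task per handler class. The handler names
# here are hypothetical; use whatever socbindtry actually defines.
handlers = [socbindtry.BindableHTTPHandler3, socbindtry.BindableHTTPHandler4]
for handler in handlers:
    pool.apply_async(send_request, args=(handler, response_time, error_count, error_lock))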

Answer 1 (score: 1)

Possibly because of Python Multiprocess diff between Windows and Linux (I'm serious, I don't know how multiprocessing works in virtual machines, as is the case here.)

This might work:

import multiprocessing
import urllib2
import random
import myurllist    #list of all destination urls for all 10 servers
import time
import socbindtry   #script that binds various virtual/aliased client ips to the script

def send_request3(response_time, error_count):    #function to send requests from alias client ip 1
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler3)    #bind to alias client ip1
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
            response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        error_count.value=error_count.value+1
def send_request4(response_time, error_count):    #function to send requests from alias client ip 2
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler4)    #bind to alias client ip2
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
            response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        error_count.value=error_count.value+1
#50 such functions are defined here for 50 clients
def func():
    m=multiprocessing.Manager()
    response_time=m.list()    #some shared variables
    error_count=m.Value('i',0)    #a Manager Value can be passed to pool workers, unlike multiprocessing.Value

    pool=multiprocessing.Pool(processes=750)
    for i in range(5):
        pool.apply_async(send_request3, [response_time, error_count])
        pool.apply_async(send_request4, [response_time, error_count])
        # pool.apply_async(send_request5)
#append 50 functions here
    pool.close()
    pool.join()
    print"All work Done..!!"
    return


start=float(time.time())
func()
end=float(time.time())-start
print end