Calling Celery tasks from SimpleXMLRPCServer

Asked: 2016-01-04 19:48:11

Tags: python multithreading celery xmlrpclib

I am trying to build a simple RPC server using SimpleXMLRPCServer and Celery. The idea is that a remote client (client.py) calls tasks on the server (server.py) via xmlrpc.client; the tasks are functions registered as Celery tasks (runnable.py).

The problem is: when an RPC function is registered via register_function, I can call it directly by name and it executes correctly, but without Celery. What I want is to call it via name.delay() from client.py, so that Celery executes it without blocking the server thread. server.py should then act as a proxy, allowing multiple clients to run the full set of functions, e.g.:

for task in flow:
    job = globals()[task]
    result = job.delay("some arg")
    while True:
        if result.ready():
            break

I have tried register_instance with allow_dotted_names=True, but I ran into an error:

xmlrpc.client.Fault: <Fault 1: "<class 'TypeError'>:cannot marshal <class '_thread.RLock'> objects">

Which leads to my question: is it even possible to do something like this?
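The Fault itself comes from the XML-RPC marshaller, which only handles basic types (int, str, list, dict, ...); exposing a Celery task object via register_instance with allow_dotted_names drags in internals such as a _thread.RLock. The marshalling limit can be reproduced locally without Celery:

```python
import threading
import xmlrpc.client

# xmlrpc can only marshal basic types; a lock object raises the same
# TypeError that the server wraps into a Fault
err = None
try:
    xmlrpc.client.dumps((threading.RLock(),))
except TypeError as exc:
    err = exc

print(err)  # cannot marshal <class '_thread.RLock'> objects
```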

Simplified code:

server.py

# ...runnable.py import
# ...rpc init
def register_tasks():
    for task in get_all_tasks():
        setattr(self, task, globals()[task])
        self.server.register_function(getattr(self, task), task)
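For context, here is a minimal self-contained sketch of the server class this snippet seems to assume. The class name RPCServer and the port choice are placeholders, and the Celery parts are replaced by plain functions so that only the registration mechanics are shown:

```python
from xmlrpc.server import SimpleXMLRPCServer
import threading
import xmlrpc.client

def say_hello():
    # stand-in for the Celery task; returns immediately
    return "hello there"

def get_all_tasks():
    # stand-in for the Celery-backed version in runnable.py
    return ["say_hello"]

class RPCServer:
    """Hypothetical wrapper class; plays the role of 'self' in the snippet above."""

    def __init__(self, host="127.0.0.1", port=0):
        # port 0 asks the OS for any free port
        self.server = SimpleXMLRPCServer((host, port), logRequests=False)
        self.register_tasks()

    def register_tasks(self):
        for task in get_all_tasks():
            setattr(self, task, globals()[task])
            self.server.register_function(getattr(self, task), task)

    def serve_in_background(self):
        threading.Thread(target=self.server.serve_forever, daemon=True).start()

rpc = RPCServer()
rpc.serve_in_background()
port = rpc.server.server_address[1]
proxy = xmlrpc.client.ServerProxy("http://127.0.0.1:%d" % port)
greeting = proxy.say_hello()
print(greeting)  # hello there
rpc.server.shutdown()
```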

runnable.py

app = Celery("tasks", backend="amqp", broker="amqp://")

@app.task()
def say_hello():
    return "hello there"

@app.task()
def say_goodbye():
    return "bye, bye"

def get_all_tasks():
    tasks = app.tasks
    runnable = []

    for t in tasks:
        if t.startswith("modules.runnable"):
            runnable.append(t.split(".")[-1])

    return runnable
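The filtering in get_all_tasks is plain string work, so it can be sanity-checked without a broker. A standalone version of the same logic (the task names below are made up; Celery also registers its own built-in tasks under other prefixes, which is why the startswith filter is needed):

```python
def short_task_names(task_names, prefix="modules.runnable"):
    """Keep tasks under `prefix` and strip the module path, as get_all_tasks does."""
    return [t.split(".")[-1] for t in task_names if t.startswith(prefix)]

names = [
    "modules.runnable.say_hello",
    "modules.runnable.say_goodbye",
    "celery.chord_unlock",          # built-in Celery task, filtered out
]
print(short_task_names(names))  # ['say_hello', 'say_goodbye']
```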

And finally, client.py

s = xmlrpc.client.ServerProxy("http://127.0.0.1:8000")
print(s.say_hello())

1 Answer:

Answer 0 (score: 0)

I came up with the idea of creating some extra wrappers for Celery's delay functions. These are registered in such a way that an RPC client can call rpc.the_remote_task.delay(*args). This returns the Celery job ID; the client then asks whether the job is done with rpc.ready(job_id) and retrieves the result with rpc.get(job_id). For now there is an obvious security hole, because anyone who knows a job ID can fetch its result, but still: it works fine.

Registering the tasks (server.py)

def register_tasks():
    for task in get_all_tasks():
        # Build one wrapper per task; the closure captures the concrete
        # task object, so each registered function delays its own task.
        def make_delay(task_fn):
            def runtime_task_delay(*args):
                return celery_wrapper(task_fn, "delay", *args)
            return runtime_task_delay

        delay_fn = make_delay(globals()[task])
        setattr(self, task + "_delay", delay_fn)
        self.server.register_function(delay_fn, task + ".delay")

    def job_ready(jid):
        return celery_wrapper(None, "ready", jid)

    def job_get(jid):
        return celery_wrapper(None, "get", jid)

    setattr(self, "ready", job_ready)
    setattr(self, "get", job_get)

    self.server.register_function(job_ready, "ready")
    self.server.register_function(job_get, "get")

The wrapper (server.py)

def celery_wrapper(task, method, *args):
    if method == "delay":
        job = task.delay(*args)
        job_id = job.id

        return job_id
    elif method == "ready":
        res = app.AsyncResult(args[0])
        return res.ready()
    elif method == "get":
        res = app.AsyncResult(args[0])
        return res.get()
    else:
        return "0"
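The dispatch in celery_wrapper can be exercised without a running broker by substituting stubs for the Celery objects. FakeJob, FakeTask, FakeResult and FakeApp below are test doubles, not part of Celery; the wrapper body itself matches the one above:

```python
# Test doubles standing in for Celery objects (not part of Celery itself)
class FakeJob:
    id = "job-123"

class FakeTask:
    def delay(self, *args):
        return FakeJob()

class FakeResult:
    def __init__(self, jid):
        self.jid = jid
    def ready(self):
        return True
    def get(self):
        return "hello there"

class FakeApp:
    def AsyncResult(self, jid):
        return FakeResult(jid)

app = FakeApp()

def celery_wrapper(task, method, *args):
    if method == "delay":
        job = task.delay(*args)
        return job.id
    elif method == "ready":
        return app.AsyncResult(args[0]).ready()
    elif method == "get":
        return app.AsyncResult(args[0]).get()
    return "0"

jid = celery_wrapper(FakeTask(), "delay", "some arg")
print(jid)                                 # job-123
print(celery_wrapper(None, "ready", jid))  # True
print(celery_wrapper(None, "get", jid))    # hello there
```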

The RPC call (client.py)

jid = s.the_remote_task.delay("arg1", "arg2")
is_running = True
while is_running:
    is_running = not s.ready(jid)

    if not is_running:
        print(s.get(jid))
    time.sleep(.01)
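The polling loop above can be wrapped in a small helper with a timeout, so the client does not spin forever on a lost job. `wait_for_result` is my own addition, not part of the answer; `proxy` stands for the xmlrpc.client.ServerProxy:

```python
import time

def wait_for_result(proxy, jid, timeout=30.0, interval=0.01):
    """Poll proxy.ready(jid) until the job finishes, then fetch the result.

    Raises TimeoutError if the job does not finish within `timeout` seconds.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if proxy.ready(jid):
            return proxy.get(jid)
        time.sleep(interval)
    raise TimeoutError("job %s did not finish in %.1fs" % (jid, timeout))
```

Usage then collapses to `result = wait_for_result(s, jid)`.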