I have a ClickHouse database and I want to run asynchronous queries against it, each query on a different node. I found a suitable example at https://docs.python.org/3/library/asyncio-queue.html#examples. I modified it a bit (see below) and it works, but...
import asyncio

from aioch import Client

result_set = []


async def exec_sql(name, queue, client):
    while True:
        print('name =', name)
        sql = await queue.get()
        result_set.append(await client.execute(sql))
        # Notify the queue that the "work item" has been processed.
        queue.task_done()


async def main():
    num_of_nodes = 10
    num_of_sqls = 20
    ports = range(2441, 2451)
    clients = [Client(host='localhost', port=port, database='database', compression=True) for port in ports]

    # Create a queue that we will use to store our "workload".
    queue = asyncio.Queue()

    # Generate the SQL statements.
    for _ in range(num_of_sqls):
        sql = 'select hostName()'
        queue.put_nowait(sql)

    # Create worker tasks to process the queue concurrently.
    tasks = []
    for i in range(num_of_nodes):
        task = asyncio.create_task(exec_sql(f'worker-{i}', queue, clients[i]))
        tasks.append(task)

    # Wait until the queue is fully processed.
    await queue.join()

    # Cancel our worker tasks.
    for task in tasks:
        task.cancel()

    # Wait until all worker tasks are cancelled.
    await asyncio.gather(*tasks, return_exceptions=True)


asyncio.run(main())
print(result_set)
Is there a better way to collect the result of each query than defining an empty list, result_set, at the start?
Answer 0 (score: 0)
You should be able to return the awaited result from the worker. That way everything gets stitched together at gather time, i.e.
async def exec_sql(name, queue, client):
    while True:
        print('name =', name)
        sql = await queue.get()
        result = await client.execute(sql)
        # Notify the queue that the "work item" has been processed.
        queue.task_done()
        return result
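Note that with a `return` inside the loop each worker processes exactly one queue item, so with fewer workers than queries `queue.join()` would never complete. A variant that keeps the per-worker collection idea is to have each worker drain the queue and return its own list, then flatten what `asyncio.gather` hands back. The sketch below is self-contained and uses a hypothetical `FakeClient` stub in place of the real aioch `Client`, so it runs without a ClickHouse server; the stub's behavior is an assumption for illustration only:

```python
import asyncio


class FakeClient:
    """Stand-in for aioch.Client so the sketch runs without ClickHouse."""

    def __init__(self, port):
        self.port = port

    async def execute(self, sql):
        await asyncio.sleep(0)            # simulate network I/O
        return [(f'node-{self.port}',)]   # pretend row set


async def exec_sql(queue, client):
    results = []
    while True:
        try:
            sql = queue.get_nowait()      # drain until the queue is empty
        except asyncio.QueueEmpty:
            return results                # hand this worker's results to gather()
        results.append(await client.execute(sql))
        queue.task_done()


async def main():
    queue = asyncio.Queue()
    for _ in range(20):
        queue.put_nowait('select hostName()')

    clients = [FakeClient(port) for port in range(2441, 2451)]
    tasks = [asyncio.create_task(exec_sql(queue, c)) for c in clients]

    # gather() stitches each worker's return value into one list of lists...
    per_worker = await asyncio.gather(*tasks)
    # ...which we flatten into a single result set.
    return [rows for worker_results in per_worker for rows in worker_results]


result_set = asyncio.run(main())
print(len(result_set))
```

This avoids the module-level `result_set` entirely: the results exist only as return values, and no task cancellation is needed because each worker exits on its own once the queue is empty.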