When a user visits a page in my Flask app, I want to increment a counter. If two users visit the page, the count should increase by 2. I tried the following, but the count is always 1. How do I increment the value on each visit?
@app.route('/count')
def make_count():
    count = 0
    value = count + 1
    return jsonify(count=value)
Answer 0 (score: 9)
Counting concurrently is hard. Assume the count is 0. If two users hit the endpoint close enough together, each may read the value 0, increment it to 1, and write it back. Two users hit the endpoint, but the resulting count is 1, not 2. To work around this, you need to use a data store that supports incrementing atomically (i.e., an operation that only one process can execute at a time).
You can't use a simple Python global, because WSGI servers spawn multiple processes, so each process has its own independent copy of the global. Repeated requests could be handled by different processes, resulting in different, unsynchronized values.
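For illustration, here is a sketch of that broken global pattern (this snippet is not from the original post; it just makes the pitfall concrete):

from flask import Flask, jsonify

app = Flask(__name__)
count = 0  # each WSGI worker process gets its own copy of this global

@app.route('/count')
def make_count():
    global count
    count += 1  # increments only this process's copy; not thread safe either
    return jsonify(count=count)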
The simplest solution is a Python multiprocessing.Value. This synchronizes access to a shared value across processes, as long as the processes are spawned after the value is created.
from flask import Flask, jsonify
from multiprocessing import Value

counter = Value('i', 0)
app = Flask(__name__)

@app.route('/')
def index():
    with counter.get_lock():
        counter.value += 1

    return jsonify(count=counter.value)

app.run(processes=8)

# access http://localhost:5000/ multiple times quickly, the count will be correct
There are still some caveats. For a real-world scenario, Redis is a much more robust solution: the server is independent of the web application, has persistence options, and can do atomic increments. It can also be used for other parts of the application, such as caching.
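For instance, a minimal sketch of the Redis approach (illustrative, not from the original answer; it assumes a Redis server on localhost:6379 and the redis-py package, and the key name hit_count is made up):

from flask import Flask, jsonify
import redis

app = Flask(__name__)
store = redis.Redis()  # connects to localhost:6379 by default

@app.route('/')
def index():
    # INCR is atomic on the Redis server, so concurrent requests from any
    # number of worker processes each receive a distinct value
    value = store.incr('hit_count')
    return jsonify(count=value)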
Answer 1 (score: 1)
There is a small problem in the accepted answer from @davidism: the multiprocessing.Value is read outside of the lock, so there is still a chance of duplicate values if you're unlucky.
Here is an example showing that collision. It also shows how this collision is possible if you are using async code (asyncio has its own locking mechanisms):
import asyncio
import concurrent.futures
import time
from multiprocessing import Value

# Have sleep timings that could cause value collisions outside of lock context manager
TIMINGS = [(0, 0), (1, 1), (0, 2)]
counter = Value('i', 0)

def incr_counter(pre_incr_sleep, pre_return_sleep):
    """Return outside of the locked context (this can multi increment in some scenarios)"""
    time.sleep(pre_incr_sleep)
    with counter.get_lock():
        counter.value += 1
    time.sleep(pre_return_sleep)
    return counter.value

def incr_counter_context(pre_incr_sleep, pre_return_sleep):
    """Return inside of the locked context (this shouldn't multi increment)"""
    time.sleep(pre_incr_sleep)
    with counter.get_lock():
        counter.value += 1
        time.sleep(pre_return_sleep)
        return counter.value

async def aincr_counter(pre_incr_sleep, pre_return_sleep):
    """Return outside of the locked context (This should multi increment in some scenarios)"""
    await asyncio.sleep(pre_incr_sleep)
    with counter.get_lock():
        counter.value += 1
    await asyncio.sleep(pre_return_sleep)
    return counter.value

async def aincr_counter_context(pre_incr_sleep, pre_return_sleep):
    """Return inside of the locked context (This shouldn't multi increment in any scenario)"""
    await asyncio.sleep(pre_incr_sleep)
    with counter.get_lock():
        counter.value += 1
        await asyncio.sleep(pre_return_sleep)
        return counter.value
print("*** Showing that multiprocessing.Value is multiprocess safe ***")
with concurrent.futures.ProcessPoolExecutor() as executor:
futures = []
print("Testing concurrent returning inside of lock...")
for timings in TIMINGS:
futures.append(executor.submit(incr_counter_context, *timings))
print("Returning value inside of lock context won't cause duplicates when using non-asyncronous executor")
print([future.result() for future in futures])
futures = []
print("Testing concurrent returning outside lock...")
for timings in TIMINGS:
futures.append(executor.submit(incr_counter, *timings))
print("Returning value outside of lock context can cause duplicate values")
print([future.result() for future in futures])
loop = asyncio.get_event_loop()
print("*** Showing that multiprocessing.Value is not async safe ***")
print("Testing async returning outside of lock...")
print(loop.run_until_complete(asyncio.gather(*[aincr_counter(pre, post) for pre, post in TIMINGS])))
print("Testing async returning inside of lock...")
print(loop.run_until_complete(asyncio.gather(*[aincr_counter_context(pre, post) for pre, post in TIMINGS])))
Here is the output of the above:
*** Showing that multiprocessing.Value is multiprocess safe ***
Testing concurrent returning inside of lock...
Returning value inside of lock context won't cause duplicates when using non-asynchronous executor
[1, 3, 2]
Testing concurrent returning outside lock...
Returning value outside of lock context can cause duplicate values
[4, 6, 6]
*** Showing that multiprocessing.Value is not async safe ***
Testing async returning outside of lock...
[8, 9, 9]
Testing async returning inside of lock...
[11, 12, 12]
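(The async variants produce duplicates even when returning inside the lock because all coroutines run in the same thread, and the default lock behind Value is re-entrant, so acquiring it never blocks a second coroutine in that thread.) For completeness, here is a minimal sketch of the asyncio-native locking hinted at above, using asyncio.Lock; it is illustrative, not from the original answer, and it only synchronizes coroutines within a single process:

import asyncio

async_count = 0
async_lock = asyncio.Lock()

async def aincr_safe():
    global async_count
    # "async with" suspends competing coroutines instead of blocking the thread
    async with async_lock:
        async_count += 1
        return async_count

async def main():
    print(await asyncio.gather(*[aincr_safe() for _ in range(3)]))  # [1, 2, 3]

asyncio.run(main())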
Luckily, you are using synchronous Flask, so the async issue is not a concern for your use case.
So, I would suggest changing the accepted answer to capture the value inside the lock context and then release the lock as quickly as possible. If you were to call jsonify or anything else while holding it, you would keep the lock during operations that don't require it.
@app.route('/')
def index():
    with counter.get_lock():
        counter.value += 1
        # save the value ASAP rather than passing to jsonify,
        # to keep lock time short
        unique_count = counter.value

    return jsonify(count=unique_count)