I have a simple HTTP server set up like this one. It handles a slow, 40-second request that opens and then closes a gate (a real metal gate). If a second HTTP request arrives while the first one is still executing, it is queued and then executed after the first one finishes. I don't want this behavior: I need to reply with an error if a gate open/close procedure is currently in progress. How can I do that? There's a 'request_queue_size' parameter, but I'm not sure how to set it.
Answer 0 (score: 1)
You need a different strategy when designing your server service. Keep the state of the gate either in memory or in a database. Each time you receive a request to operate the gate, check its current state in that store and execute the action only if it is valid for the current state; otherwise, return an error. Also, don't forget to update the stored state once an action completes.
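A minimal sketch of this state-checking approach, keeping the state in memory (the class, method names, and status codes are illustrative, and the hardware call is a placeholder):

```python
import threading

class GateController:
    """Tracks gate state in memory and rejects conflicting commands."""

    def __init__(self):
        self._state = "closed"          # "closed", "open", or "moving"
        self._lock = threading.Lock()

    def request_open(self):
        # Atomically check the state and claim the gate if it is free.
        with self._lock:
            if self._state == "moving":
                return (409, "operation already in progress")
            if self._state == "open":
                return (409, "gate is already open")
            self._state = "moving"
        try:
            self._drive_motor()          # the slow 40-second procedure
            with self._lock:
                self._state = "open"     # update the state once done
            return (200, "gate opened")
        except Exception:
            with self._lock:
                self._state = "closed"   # roll back on hardware failure
            return (500, "gate hardware error")

    def _drive_motor(self):
        pass  # placeholder for the real hardware call

```

The same pattern works with a database instead of the in-memory dict: the important part is that the check-and-claim step is atomic, so two overlapping requests can never both see the gate as free.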
Answer 1 (score: 1)
'request_queue_size' seems to have no effect. The solution is to make the server multi-threaded and to implement a 'busy' lock variable.
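A minimal sketch of that approach, assuming Python 3.7+ (which provides ThreadingHTTPServer in the standard library); the port, paths, and 'busy' naming are illustrative:

```python
import threading
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

busy = threading.Lock()   # the 'busy' flag, implemented as a lock

class GateHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # acquire(blocking=False) returns False immediately instead of queueing
        if not busy.acquire(blocking=False):
            self.send_error(503, "Gate procedure in progress, try again later")
            return
        try:
            time.sleep(40)            # stand-in for the real gate procedure
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Done\n")
        finally:
            busy.release()

def main():
    # ThreadingHTTPServer handles each request in its own thread, so a
    # second request reaches the busy check while the first is still working.
    server = ThreadingHTTPServer(("", 8000), GateHandler)
    server.serve_forever()

```

The key point is that with the default single-threaded server the second request is never even read until the first handler returns, so no in-handler check can help; the server must be multi-threaded for the busy check to run concurrently.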
Answer 2 (score: 0)
In general, the idea you're looking for is called request throttling (rate limiting). There are plenty of implementations that aren't hard to dig up on the Web; here's one for Flask, my microframework of choice: https://flask-limiter.readthedocs.io/en/stable/
Quick usage example:
from flask import Flask
from flask_limiter import Limiter
from flask_limiter.util import get_remote_address

app = Flask(__name__)
# flask-limiter >= 2.0 style initialization; older versions
# used Limiter(app, key_func=get_remote_address)
limiter = Limiter(get_remote_address, app=app)

@app.route("/open_gate")
@limiter.limit("1 per minute")   # further requests within a minute get HTTP 429
def slow():
    gate_robot.open_gate()       # your gate-control code goes here
    return "OK"