I have a service that continuously processes data; it receives requests to process new data via messaging. If requests arrive while it is busy processing, they are merged together so that they can all be processed at once. An AutoResetEvent is used to notify the processor that a new request is available.
My question: in EventLoop, can currentRequest ever be null after WaitOne returns?
Is it bad practice to call _eventAvailable.Set() outside of lock (_eventLocker)? I moved it out of the lock so that a thread waking up from WaitOne does not immediately block again on lock (_eventLocker).
Any suggestions on how the following code could be written better?
public sealed class RealtimeRunner : MarshalByRefObject
{
    /// <summary>
    /// The actual event, new events get merged into this if it is not null
    /// </summary>
    private Request _pendingRequest;

    /// <summary>
    /// Used to signal the runner thread when an event is available to process
    /// </summary>
    private readonly AutoResetEvent _eventAvailable = new AutoResetEvent(false);

    private readonly object _eventLocker = new object();

    /// <summary>
    /// Called on a background thread via messaging
    /// </summary>
    public void QueueEvent(RealtimeProcessorMessage newRequest)
    {
        bool mergedRequest;
        lock (_eventLocker)
        {
            if (_pendingRequest == null)
            {
                mergedRequest = false;
                _pendingRequest = new Request(newRequest, _engine);
            }
            else
            {
                mergedRequest = true;
                _pendingRequest.Merge(newRequest, _engine);
            }
        }
        _eventAvailable.Set();
    }

    /// <summary>
    /// This is running on its own thread
    /// </summary>
    private void EventLoop()
    {
        while (true)
        {
            // Block until something exists in _pendingRequest
            _eventAvailable.WaitOne();

            Request currentRequest;
            lock (_eventLocker)
            {
                currentRequest = _pendingRequest;
                _pendingRequest = null;
            }

            // CAN THIS EVER BE NULL?
            if (currentRequest == null)
                continue;

            // Do stuff with the currentRequest here
        }
    }
}
Answer 0 (score: 1):
Yes, if (currentRequest == null) can evaluate to true. Consider two threads racing to call _eventAvailable.Set(). One completes the call and the other gets preempted. In the meantime the EventLoop thread wakes up and completes a full iteration of the loop. You are now in a situation where _pendingRequest is null and the WaitHandle is still going to be signaled again by the preempted thread, so the next WaitOne wakes up with nothing to process.
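The underlying reason is that an AutoResetEvent is a latch, not a counter: Set() calls that land while no thread is waiting collapse into a single signal, and a Set() issued after the loop has already drained _pendingRequest simply produces an extra wake-up with nothing to take. A minimal console sketch (not part of the question's code) illustrating the latch behavior:

    using System;
    using System.Threading;

    class AutoResetEventLatchDemo
    {
        static void Main()
        {
            var signal = new AutoResetEvent(false);

            // Two Set() calls while no thread is waiting...
            signal.Set();
            signal.Set();

            // ...coalesce into a single signal: the first WaitOne succeeds and resets the event,
            Console.WriteLine(signal.WaitOne(0));   // True
            // the second finds it already reset.
            Console.WriteLine(signal.WaitOne(0));   // False
        }
    }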
I would like to propose a completely different solution to the problem. It looks like your code could be simplified by using the producer-consumer pattern, which is most easily implemented with a blocking queue. The BlockingCollection class implements such a queue.
public sealed class RealtimeRunner : MarshalByRefObject
{
    private readonly BlockingCollection<Request> m_Queue = new BlockingCollection<Request>();

    public void QueueEvent(RealtimeProcessorMessage newRequest)
    {
        m_Queue.Add(new Request(newRequest, _engine));
    }

    private void EventLoop()
    {
        while (true)
        {
            // This blocks until an item appears in the queue.
            Request request = m_Queue.Take();

            // Process the request here.
        }
    }
}
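If the original merge-while-busy behavior still matters in the queue-based version, one option (a sketch, not part of the answer above, and assuming a Request.Merge(Request) overload exists or can be added) is to drain whatever has accumulated since the last Take and fold it into a single request before processing:

    private void EventLoop()
    {
        while (true)
        {
            // Block until at least one request is available.
            Request request = m_Queue.Take();

            // Non-blocking drain: fold any further queued requests into the one we took.
            Request extra;
            while (m_Queue.TryTake(out extra))
            {
                request.Merge(extra);   // hypothetical Merge(Request) overload
            }

            // Process the merged request here.
        }
    }

This keeps the consumer loop simple while preserving the coalescing semantics of the original design.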