With Camel versions up to 2.17.x, every invocation of the Synchronization.onComplete() callback on an activemq:queue endpoint was made on a different thread, so even a message whose onComplete was very slow did not block the others or cause them to queue up. As I understand it, this is the result of the asyncConsumer=true&defaultTaskExecutorType=ThreadPool&concurrentConsumers=2&maxConcurrentConsumers=100 configuration.
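Concretely, the consumer endpoint is declared like this (the same URI as in the full code at the end of the question, shown here only to make the options visible):

from("activemq://queue:asyncTest?asyncConsumer=true&defaultTaskExecutorType=ThreadPool" +
        "&concurrentConsumers=2&maxConcurrentConsumers=100")
        .process(exchange -> exchange.getOut().setBody(exchange.getIn().getBody() + " OK"));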
So an example output of this test run is:
Received async reply: 2000 OK
Received async reply: 3000 OK
Received async reply: 5000 OK
Received async reply: 9000 OK
Received async reply: 10000 OK
Finished async reply: 2000 OK
Finished async reply: 3000 OK
Finished async reply: 5000 OK
Finished async reply: 9000 OK
Finished async reply: 10000 OK
所以"完成"记录毕竟是"收到"因为每个都在不同的线程中调用。所有回复都是异步接收的,处理的时间长短不会影响其他人的接收。
After upgrading to Camel 2.18.x (or 2.19.x) this is no longer the case. Now, while a received reply is being processed (a long operation), it blocks the reception of the others. This happens because the same thread is used to invoke Synchronization.onComplete(), so the replies queue up until each one has been fully processed.
Received async reply: 2000 OK
Received async reply: 10000 OK
Received async reply: 9000 OK
Finished async reply: 2000 OK
Received async reply: 3000 OK
Finished async reply: 3000 OK
Finished async reply: 9000 OK
Finished async reply: 10000 OK
Received async reply: 5000 OK
Finished async reply: 5000 OK
I thought this was exactly what the new options replyToConcurrentConsumers=2&replyToMaxConcurrentConsumers=100 configure: if a new reply arrives while all current threads are busy processing, it should be handled and answered on a new thread right away (unless the maximum count has already been reached, of course; in that case it is normal for a thread to process several messages and for them to queue up).
Maybe I am doing something wrong, but how do I configure the route so that I get a result similar to Camel 2.17? Increasing the replyToConcurrentConsumers option makes it work, but I would expect the pool to scale up dynamically and rely on replyToMaxConcurrentConsumers when needed.
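For reference, that static workaround is just a change of the producer endpoint options (the value 5 below is only an example, picked to be large enough for this test):

producerTemplate.asyncCallback("activemq://queue:asyncTest?" +
        "replyToConcurrentConsumers=5&replyToMaxConcurrentConsumers=100",
        exchange, callback);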
Code:
// imports follow the Camel 2.x package layout
import java.util.concurrent.CountDownLatch;

import org.apache.camel.CamelContext;
import org.apache.camel.Exchange;
import org.apache.camel.ExchangePattern;
import org.apache.camel.ProducerTemplate;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.camel.impl.DefaultExchange;
import org.apache.camel.spi.Synchronization;
import org.apache.camel.support.SynchronizationAdapter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class AsyncCallbackTest {

    private static final Logger LOG = LoggerFactory.getLogger(AsyncCallbackTest.class);

    public static void main(String[] args) throws Exception {
        // create and set up a default Camel context
        final CamelContext camelContext = new DefaultCamelContext();
        setupRoutes(camelContext);
        camelContext.start();
        // real test
        asyncCallbackTest(camelContext);
        camelContext.stop();
        System.exit(0);
    }

    private static void setupRoutes(CamelContext camelContext) throws Exception {
        camelContext.addRoutes(new RouteBuilder() {
            public void configure() {
                // consumer side: async consumer with a thread pool that may grow up to 100 consumers
                from("activemq://queue:asyncTest?asyncConsumer=true&defaultTaskExecutorType=ThreadPool" +
                        "&concurrentConsumers=2&maxConcurrentConsumers=100")
                        .process(exchange -> {
                            final String msg = String.valueOf(exchange.getIn().getBody());
                            exchange.getOut().setBody(msg + " OK");
                        });
            }
        });
    }

    private static void asyncCallbackTest(CamelContext camelContext) throws Exception {
        final int[] delays = new int[]{9000, 10000, 5000, 2000, 3000};
        final CountDownLatch countDownLatch = new CountDownLatch(delays.length);

        final Synchronization callback = new SynchronizationAdapter() {
            @Override
            public void onComplete(Exchange exchange) {
                LOG.info("Received async reply: " + exchange.getOut().getBody());
                // simulate slow processing of the reply; the delay was sent as the request body
                final int delay = (int) exchange.getIn().getBody();
                synchronized (this) {
                    try {
                        this.wait(delay);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                }
                LOG.info("Finished async reply: " + exchange.getOut().getBody());
                super.onComplete(exchange);
            }

            @Override
            public void onDone(Exchange exchange) {
                countDownLatch.countDown();
            }
        };

        final ProducerTemplate producerTemplate = camelContext.createProducerTemplate();
        for (int i = 0; i < delays.length; i++) {
            final Exchange exchange = new DefaultExchange(camelContext);
            exchange.getIn().setBody(delays[i]);
            exchange.setPattern(ExchangePattern.InOut);
            // send InOut requests; replies are handled by the Synchronization callback above
            producerTemplate.asyncCallback("activemq://queue:asyncTest?" +
                            "replyToConcurrentConsumers=2&replyToMaxConcurrentConsumers=100",
                    exchange, callback);
        }

        countDownLatch.await();
        camelContext.stop();
    }
}
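For comparison only: the one client-side way I can see to get back the old interleaving is to hand the slow part off to my own executor inside onComplete, so the single reply-handler thread is released immediately. This is just a sketch (callbackPool and the cached thread pool are arbitrary choices, and it additionally needs java.util.concurrent.ExecutorService and Executors); it is not the configuration-based answer I am looking for:

// replaces the callback inside asyncCallbackTest(); shut callbackPool down after countDownLatch.await()
final ExecutorService callbackPool = Executors.newCachedThreadPool();
final Synchronization callback = new SynchronizationAdapter() {
    @Override
    public void onComplete(Exchange exchange) {
        // free the reply-handler thread right away and do the slow part on our own pool
        callbackPool.submit(() -> {
            LOG.info("Received async reply: " + exchange.getOut().getBody());
            final int delay = (int) exchange.getIn().getBody();
            try {
                Thread.sleep(delay); // stands in for the long processing of the real callback
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            LOG.info("Finished async reply: " + exchange.getOut().getBody());
            // count down only after the slow work, so the test still waits for processing to finish
            countDownLatch.countDown();
        });
    }
};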