How to set up a ThreadPool with Spring Integration to process a file message source?

Date: 2018-07-03 01:47:24

Tags: spring-integration spring-integration-dsl spring-dsl

Can someone help me rewrite this flow to use a thread pool? The code below works, but it processes incoming files with a fixed polling delay:

  @Bean
  public IntegrationFlow sampleFlow() {
    return IntegrationFlows
          .from(fileReadingMessageSource(), c -> c.poller(Pollers.fixedDelay(500)))
          .channel(new DirectChannel())
          .transform(fileMessageToJobRequest())
          .handle(springBatchJobLauncher())
          .handle(jobExecution -> {
            logger.info("jobExecution payload: {}", jobExecution.getPayload());
          })
          .get();
  }

Threads are needed because the files arrive at a high rate.

2 Answers:

Answer 0 (score: 0):

The poller can be configured with the following option:

/**
 * Specify an {@link Executor} to perform the {@code pollingTask}.
 * @param taskExecutor the {@link Executor} to use.
 * @return the spec.
 */
public PollerSpec taskExecutor(Executor taskExecutor) {

and that is exactly where you can provide a ThreadPoolTaskExecutor instance.
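A minimal sketch of that approach, reusing the beans from the question; the pollerTaskExecutor bean name and the pool size of 5 are illustrative assumptions:

  @Bean
  public IntegrationFlow sampleFlow() {
    return IntegrationFlows
          // the polling task now runs on the supplied executor instead of the default scheduler thread
          .from(fileReadingMessageSource(), c -> c.poller(Pollers.fixedDelay(500)
                .taskExecutor(pollerTaskExecutor())))
          .transform(fileMessageToJobRequest())
          .handle(springBatchJobLauncher())
          .handle(jobExecution -> {
            logger.info("jobExecution payload: {}", jobExecution.getPayload());
          })
          .get();
  }

  @Bean
  public TaskExecutor pollerTaskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(5);
    executor.setMaxPoolSize(5);
    return executor;
  }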

Answer 1 (score: 0):

Thanks @Artem. I found a solution based on Artem's answer. The trick is to use a TaskExecutor as shown in the code below. In addition, Pollers.maxMessagesPerPoll(nbfiles) should be set so that several messages (= files) are handled per poll.

  @Bean
  public IntegrationFlow sampleFlow() {
    return IntegrationFlows
          // poll up to 5 file messages every 5 seconds
          .from(fileReadingMessageSource(), c -> c.poller(Pollers.fixedDelay(5000).maxMessagesPerPoll(5)))
          // hand each message off to the thread pool for the rest of the flow
          .channel(MessageChannels.executor(threadPoolTaskExecutor()))
          .transform(fileMessageToJobRequest())
          .handle(springBatchJobLauncher())
          .handle(jobExecution -> {
            logger.debug("jobExecution payload: {}", jobExecution.getPayload());
          })
          .get();
  }

  @Bean
  public TaskExecutor threadPoolTaskExecutor() {
    int poolSize = 5;
    logger.debug("...... creating ThreadPool of size {}.", poolSize);
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setThreadNamePrefix("Dama_Thread_");
    executor.setCorePoolSize(poolSize);
    executor.setMaxPoolSize(poolSize);
    executor.setQueueCapacity(22);
    return executor;
  }
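With this setup, the single poller thread only picks up at most five file messages every five seconds and drops them onto the executor channel; the transform, the Spring Batch job launch, and the logging handler then run on the Dama_Thread_ pool, so several files can be processed concurrently while any excess waits in the executor queue. This is a slightly different placement from answer 0: the executor backs a downstream channel rather than the poller itself, but in both cases the heavy work is moved off the polling thread.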