Consider marking one of the beans as @Primary, updating the consumer to accept multiple beans, or using @Qualifier to identify the bean that should be consumed

Time: 2018-07-08 17:36:47

Tags: spring spring-boot spring-batch

I am working with a Spring Batch example on Spring Boot. So far I have developed two batch jobs in my project, and in the future around 10 batch jobs will run together.

I developed the code for the second batch job, and as soon as I started running it I began getting the error below.

I have the following questions:

  1. How can I run only the second batch job for development and unit testing?
  2. How can I run all the batch jobs in one go?

The error I am getting:

org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type 'org.springframework.batch.core.Job' available: expected single matching bean but found 2: exportemployeesJob,exportOrdersJob
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveNotUnique(DependencyDescriptor.java:215) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1116) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:584) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:91) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:373) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1350) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:580) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:760) ~[spring-beans-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:869) ~[spring-context-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550) ~[spring-context-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:759) [spring-boot-2.0.3.RELEASE.jar:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:395) [spring-boot-2.0.3.RELEASE.jar:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:327) [spring-boot-2.0.3.RELEASE.jar:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255) [spring-boot-2.0.3.RELEASE.jar:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1243) [spring-boot-2.0.3.RELEASE.jar:2.0.3.RELEASE]
    at com.prateek.SpringBatchClassicDbApplication.main(SpringBatchClassicDbApplication.java:12) [classes/:na]

2018-07-08 21:45:23.396 ERROR 11600 --- [           main] o.s.b.d.LoggingFailureAnalysisReporter   : 

***************************
APPLICATION FAILED TO START
***************************

Description:

Field job in com.prateek.scheduler.EmployeesRunScheduler required a single bean, but 2 were found:
    - exportemployeesJob: defined by method 'exportemployeesJob' in class path resource [com/prateek/job/EmployeesJob.class]
    - exportOrdersJob: defined by method 'exportOrdersJob' in class path resource [com/prateek/job/OrdersJob.class]


Action:

Consider marking one of the beans as @Primary, updating the consumer to accept multiple beans, or using @Qualifier to identify the bean that should be consumed
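
For reference, a minimal sketch of the @Qualifier option from that hint, using the field and bean names taken from the report above (the JobLauncher usage and the @Scheduled method are illustrative assumptions, not the original class):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.scheduling.annotation.Scheduled;

public class EmployeesRunScheduler {

    @Autowired
    private JobLauncher jobLauncher;

    // Qualifying by bean name picks exportemployeesJob out of the two Job beans
    @Autowired
    @Qualifier("exportemployeesJob")
    private Job job;

    // The schedule is an assumption and requires @EnableScheduling somewhere in the application
    @Scheduled(fixedDelay = 60_000)
    public void runEmployeesJob() throws Exception {
        jobLauncher.run(job, new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters());
    }
}

The class is already registered through the scheduler() @Bean method in EmployeesJob shown below, so no additional stereotype annotation is needed on it.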

OrdersJob.java

@Configuration
@EnableBatchProcessing
public class OrdersJob {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private DataSource dataSource;

    @Bean(destroyMethod="")
    public JdbcCursorItemReader<Orders> ordersReader(){
        JdbcCursorItemReader<Orders> itemReader = new JdbcCursorItemReader<>();
        itemReader.setDataSource(dataSource);
        itemReader.setSql("SELECT orderNumber, productName, msrp, priceEach "
                + "FROM products p "
                + "INNER JOIN orderdetails o "
                + "ON p.productcode = o.productcode "
                + "AND p.msrp > o.priceEach ");
                //+ "WHERE p.productcode = ? ");
        itemReader.setRowMapper(new OrdersRowMapper());
        itemReader.setIgnoreWarnings(true);

        return itemReader;
    }


    @Bean(destroyMethod="")
    public FlatFileItemWriter<Orders> ordersWriter(){
        FlatFileItemWriter<Orders> fileItemWriter = new FlatFileItemWriter<>();
        fileItemWriter.setResource(new FileSystemResource("csv/Orders.csv"));
        //fileItemWriter.setHeaderCallback(headerCallback());

        DelimitedLineAggregator<Orders> lineAggregator = new DelimitedLineAggregator<>();
        lineAggregator.setDelimiter(",");
        lineAggregator.setFieldExtractor(new PassThroughFieldExtractor<Orders>());

        fileItemWriter.setLineAggregator(lineAggregator);
        fileItemWriter.setShouldDeleteIfEmpty(true);

        return fileItemWriter;
    }


    // Step Execution
    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<Orders, Orders>chunk(10)
                .reader(ordersReader())
                .writer(ordersWriter())
                .build();
    }

    // Job Execution
    @Bean
    public Job exportOrdersJob() {
        return jobBuilderFactory
                .get("exportOrdersJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1())
                .end()
                .build();
    }
}

EmployeesJob.java

@Configuration
@EnableBatchProcessing
public class EmployeesJob {
    public static final String DATE_FORMAT = "dd-MM-yyyy-hh-mm-ssss";
    public static final DateFormat formatter = new SimpleDateFormat(DATE_FORMAT);

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private DataSource dataSource;

    @Bean 
    public EmployeesProcessor employeesProcessor() {
        return new EmployeesProcessor();
    }

    @Bean
    public EmployeesRunScheduler scheduler() {
        return new EmployeesRunScheduler();
    }

    // This file helps to create CSV column aliases
    @Bean
    public EmployeesFlatFileWriterCallback headerCallback() {
        return new EmployeesFlatFileWriterCallback();
    }

    @Bean(destroyMethod="")
    public JdbcCursorItemReader<Employees> employeesReader(){
        JdbcCursorItemReader<Employees> itemReader = new JdbcCursorItemReader<>();
        itemReader.setDataSource(dataSource);
        itemReader.setSql("SELECT employeeNumber, lastName, firstName, extension, email, officeCode, reportsTo, jobTitle FROM employees ");
        itemReader.setRowMapper(new EmployeeRowMapper());
        // The fetch size can be controlled from the application.properties 
        //itemReader.setFetchSize(200);
        return itemReader;
    }


    @Bean(destroyMethod="")
    public FlatFileItemWriter<Employees> employeesWriter(){
        FlatFileItemWriter<Employees> fileItemWriter = new FlatFileItemWriter<>();
        //fileItemWriter.setResource(new FileSystemResource("csv/employees.csv"));
        fileItemWriter.setResource(new FileSystemResource("csv/employees-#{"+ formatter.format(new Date()) +"}.csv"));
        fileItemWriter.setHeaderCallback(headerCallback());

        BeanWrapperFieldExtractor<Employees> fieldExtractor = new BeanWrapperFieldExtractor<>();
        fieldExtractor.setNames(new String[] {"employeeNumber", "lastName", "firstName", "extension", "email", "officeCode", "reportsTo", "jobTitle"});

        DelimitedLineAggregator<Employees> lineAggregator = new DelimitedLineAggregator<>();
        lineAggregator.setDelimiter(",");
        lineAggregator.setFieldExtractor(fieldExtractor);

        fileItemWriter.setLineAggregator(lineAggregator);
        fileItemWriter.setShouldDeleteIfEmpty(true);

        return fileItemWriter;
    } 

    // Step Execution
    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<Employees, Employees>chunk(10)
                .reader(employeesReader())
                .processor(employeesProcessor())
                .writer(employeesWriter())
                .build();
    }

    // Job Execution
    @Bean
    public Job exportemployeesJob() {
        return jobBuilderFactory
                .get("employeesJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1())
                .end()
                .build();
    }
}

1 Answer:

Answer (score: 0):


How can I run only the second batch job for development and unit testing?

You can put the second job behind a Spring profile and activate that profile only for development and unit testing; a minimal sketch follows.
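
For example, the second job's configuration class can be guarded with @Profile (the profile name "orders" below is an assumption); the first job can be guarded the same way with its own profile, so only the jobs whose profiles are active get registered:

import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

// Only registered when the "orders" profile is active
@Configuration
@EnableBatchProcessing
@Profile("orders")
public class OrdersJob {
    // ... the reader, writer, step and exportOrdersJob beans from the question, unchanged ...
}

Activate the profile with spring.profiles.active=orders in application.properties (or --spring.profiles.active=orders on the command line), and in a test by putting @ActiveProfiles("orders") on the test class.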


How can I run all the batch jobs in one go?

By default, Spring Boot executes all Job beans found in the application context sequentially on startup (more details: https://docs.spring.io/spring-boot/docs/current/reference/html/howto-batch-applications.html). If by "in one go" you mean launching all jobs in parallel, you can use an asynchronous JobLauncher backed by a TaskExecutor such as the ThreadPoolTaskExecutor. More details here: https://docs.spring.io/spring-batch/4.0.x/reference/html/job.html#configuringJobLauncher. A sketch of such a configuration follows.
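
One common way to plug such a launcher into an @EnableBatchProcessing application is to override DefaultBatchConfigurer.createJobLauncher(), so the context keeps a single JobLauncher bean. This is a sketch, not code from the answer; the class name and pool sizes are assumptions, and it targets Spring Batch 4.x / Spring Boot 2.0 as shown in the stack trace above.

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
public class AsyncBatchConfig extends DefaultBatchConfigurer {

    @Override
    protected JobLauncher createJobLauncher() throws Exception {
        // Pool sizes are illustrative; roughly one thread per concurrently running job
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setCorePoolSize(4);
        taskExecutor.setMaxPoolSize(10);
        taskExecutor.setThreadNamePrefix("batch-");
        taskExecutor.initialize();

        // Same SimpleJobLauncher the default configuration builds, but backed by the executor
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.setTaskExecutor(taskExecutor);
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }
}

With an asynchronous launcher, jobLauncher.run(job, parameters) returns as soon as the job is submitted, so jobs started one after another at boot effectively run in parallel.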