I am trying to use two DataSources with Spring Batch on Spring Boot, both on their latest versions. The goal is for Spring Batch to use an embedded data source for its internal metadata, while the other data source is used by my own logic, i.e. reading from and writing to the database. At runtime I get the following error:
Caused by: org.springframework.beans.factory.BeanCurrentlyInCreationException: Error creating bean with name 'secondDataSource': Requested bean is currently in creation: Is there an unresolvable circular reference?
I got past this by splitting the DataSource configuration into its own class, as @Mahmoud Ben Hassine suggested in the comments below, but now I get another error:
Failed to bind properties under '' to com.zaxxer.hikari.HikariDataSource:
Property: driverclassname
Value: org.h2.Driver
Origin: "driverClassName" from property source "source"
Reason: Failed to load driver class org.h2.Driver in either of HikariConfig class loader or Thread context classloader
Action:
Update your application's configuration
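As far as I understand, "Failed to load driver class" means the class is simply not on the classpath at all, independent of any Spring wiring. A minimal JDK-only check (`DriverCheck` is a hypothetical helper, not part of my project) reproduces the symptom:

```java
// Minimal JDK-only check for whether a JDBC driver class is on the
// classpath; DriverCheck is a hypothetical helper, not part of my project.
public class DriverCheck {

    static boolean isDriverPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Prints false unless the H2 jar is on the classpath.
        System.out.println("org.h2.Driver present: " + isDriverPresent("org.h2.Driver"));
    }
}
```

If this prints false, the fix would be a build dependency (e.g. `com.h2database:h2` in the pom) rather than a configuration change.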
Can anyone see what I am doing wrong?
Thanks in advance.
@SpringBootApplication(exclude = { DataSourceAutoConfiguration.class, HibernateJpaAutoConfiguration.class, DataSourceTransactionManagerAutoConfiguration.class })
@EnableTransactionManagement
public class BatchApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchApplication.class, args);
    }
}
application.properties
spring.datasource.url=jdbc:informix-sqli://king:2000/kong:INFORMIXSERVER=pong
spring.datasource.username=user
spring.datasource.password=pass
app.datasource.first.url=jdbc:informix-sqli://king:2000/kong:INFORMIXSERVER=pong
app.datasource.first.username=user
app.datasource.first.password=pass
app.datasource.first.driver-class-name=com.informix.jdbc.IfxDriver
app.datasource.second.url=jdbc:h2:~/test
app.datasource.second.username=sa
app.datasource.second.password=sa
app.datasource.second.driver-class-name=org.h2.Driver
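Note that the error message reports the key as `driverclassname` while the file spells it `driver-class-name`; Spring Boot's relaxed binding treats these as the same property, so the spelling is not the problem. A rough illustration of the kebab-case to camelCase mapping (my own sketch, not Spring's actual implementation):

```java
// Rough illustration of relaxed binding from kebab-case property keys
// to camelCase bean property names (not Spring's real implementation).
public class RelaxedBinding {

    static String kebabToCamel(String key) {
        StringBuilder out = new StringBuilder();
        boolean upper = false;
        for (char c : key.toCharArray()) {
            if (c == '-') {
                upper = true; // uppercase the next character
                continue;
            }
            out.append(upper ? Character.toUpperCase(c) : c);
            upper = false;
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(kebabToCamel("driver-class-name")); // driverClassName
    }
}
```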
DataSourceConfiguration
@Configuration
@PropertySource("classpath:application.properties")
public class BasicDataSourceConfiguration {

    @Bean
    // @Primary
    @ConfigurationProperties("app.datasource.first")
    public DataSourceProperties firstDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    // @Primary
    @ConfigurationProperties("app.datasource.first.configuration")
    public DataSource firstDataSource() {
        return firstDataSourceProperties().initializeDataSourceBuilder().build();
    }

    @Bean
    @Primary
    @ConfigurationProperties("app.datasource.second")
    public DataSourceProperties secondDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "secondDataSource")
    @Primary
    @ConfigurationProperties("app.datasource.second.configuration")
    public HikariDataSource secondDataSource() {
        return secondDataSourceProperties().initializeDataSourceBuilder().type(HikariDataSource.class).build();
    }
}
Here is my batch configuration:
@Configuration
@EnableBatchProcessing
@Import(BasicDataSourceConfiguration.class)
//@ComponentScan(basePackageClasses = DefaultBatchConfigurer.class)
public class BatchConfiguration extends DefaultBatchConfigurer {

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    private AflRepository aflRepository;
    @Bean
    public ItemReader<Afl> aflReader() {
        RepositoryItemReader<Afl> repositoryItemReader = new RepositoryItemReader<>();
        repositoryItemReader.setRepository(aflRepository);
        repositoryItemReader.setMethodName("findById");
        List<Object> parameters = new ArrayList<>();
        parameters.add(0L);
        repositoryItemReader.setArguments(parameters);
        Map<String, Sort.Direction> sort = new HashMap<>();
        sort.put("id", Sort.Direction.ASC);
        repositoryItemReader.setSort(sort);
        System.out.println("Inside aflReader reading from database.");
        return repositoryItemReader;
    }
    @Bean
    public AflItemProcessor aflItemProcessor() {
        return new AflItemProcessor();
    }

    @Bean
    public ItemWriter<Afl> aflWriter() {
        RepositoryItemWriter<Afl> repositoryItemWriter = new RepositoryItemWriter<>();
        repositoryItemWriter.setRepository(aflRepository);
        repositoryItemWriter.setMethodName("save");
        System.out.println("Inside aflWriter writing to the database.");
        return repositoryItemWriter;
    }
    @Bean
    public Step aflStep() {
        System.out.println("Inside aflStep putting it all together.");
        return stepBuilderFactory.get("aflStep")
                .<Afl, Afl>chunk(10)
                .reader(aflReader())
                .processor(aflItemProcessor())
                .writer(aflWriter())
                .build();
    }

    @Bean
    public Job aflJob(AflJobCompletionNotificationListener listener, Step aflStep) {
        System.out.println("Inside aflJob creating job.");
        return jobBuilderFactory.get("aflJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .flow(aflStep)
                .end()
                .build();
    }

    private ResourcelessTransactionManager batchTransactionManager() {
        return new ResourcelessTransactionManager();
    }
    // @Override
    // protected JobRepository createJobRepository() throws Exception {
    //     System.out.println("Inside createJobRepository creating custom JobRepository.");
    //
    //     JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    //     factory.setDataSource(secondDataSource);
    //     factory.setDatabaseType(DatabaseType.H2.getProductName());
    //     factory.setTransactionManager(batchTransactionManager());
    ////     factory.setDataSource(informixDataSource());
    ////     factory.setDatabaseType(DatabaseType.ORACLE.getProductName());
    //     factory.afterPropertiesSet();
    //
    //     return factory.getObject();
    // }
    @Override
    public void setDataSource(@Qualifier("secondDataSource") DataSource dataSource) {
        super.setDataSource(dataSource);
    }
}