We are developing a Spring Batch application and use an HSQL database as an in-memory job repository. We do not want to keep job execution history, so on every run we drop and recreate the JobRepository tables. So far this has worked without any problems.

My requirement is to ensure that only one instance of the job runs at a time. If the job is triggered twice, the second instance should be rejected.

Job context configuration:
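The guard I am after can be sketched without any Spring classes. This is a minimal single-JVM sketch of the "reject a second concurrent run" pattern; `SingleInstanceGuard` and the `Runnable` stand-in for the job launch are hypothetical names, not part of my actual code:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Minimal sketch: at most one run in flight per JVM.
// A hypothetical Runnable stands in for the Spring Batch job launch.
class SingleInstanceGuard {
    private final AtomicBoolean running = new AtomicBoolean(false);

    /** Runs the task only if no other run is in flight; returns false if rejected. */
    boolean tryRun(Runnable task) {
        if (!running.compareAndSet(false, true)) {
            return false; // another instance is already running: reject
        }
        try {
            task.run();
            return true;
        } finally {
            running.set(false); // always release the guard, even on failure
        }
    }
}
```

Note this only works when both triggers share one JVM; as described below, my two instances are separate processes, which is where the trouble starts.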
<bean id="propertyUtil" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="ignoreUnresolvablePlaceholders" value="true" />
    <property name="locations">
        <list>
            <value>classpath:batch.properties</value>
        </list>
    </property>
    <property name="properties">
        <map>
            <entry key="environment" value="#{systemProperties.environment}" />
        </map>
    </property>
</bean>

<bean id="jobRepository" class="org.springframework.batch.core.repository.support.JobRepositoryFactoryBean">
    <property name="dataSource" ref="repDataSource" />
    <property name="transactionManager" ref="repTransactionManager" />
    <property name="databaseType" value="hsql" />
</bean>

<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository" />
</bean>

<bean id="jobExplorer" class="org.springframework.batch.core.explore.support.JobExplorerFactoryBean">
    <property name="dataSource" ref="repDataSource" />
</bean>

<bean id="repDataSource" class="org.apache.commons.dbcp.BasicDataSource" lazy-init="true" destroy-method="close">
    <property name="driverClassName" value="org.hsqldb.jdbcDriver" />
    <property name="url" value="jdbc:hsqldb:file:${${environment}_path}/batchcore.db;shutdown=true;" />
    <property name="initialSize" value="5" />
    <property name="maxActive" value="10" />
    <property name="minIdle" value="1" />
    <property name="maxWait" value="10000" />
    <property name="minEvictableIdleTimeMillis" value="1800000" />
    <property name="timeBetweenEvictionRunsMillis" value="60000" />
    <property name="validationQuery" value="SELECT 1 FROM INFORMATION_SCHEMA.SYSTEM_USERS" />
    <property name="testOnBorrow" value="true" />
    <property name="testOnReturn" value="false" />
    <property name="testWhileIdle" value="false" />
</bean>

<!-- Create meta-tables -->
<jdbc:initialize-database data-source="repDataSource">
    <jdbc:script location="org/springframework/batch/core/schema-drop-hsqldb.sql" />
    <jdbc:script location="org/springframework/batch/core/schema-hsqldb.sql" />
</jdbc:initialize-database>

<bean id="repTransactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="repDataSource" />
</bean>
I tried extending JobExecutionListenerSupport and overriding the beforeJob method:
public class JobExecutionListenerMonitor extends JobExecutionListenerSupport {

    private JobExecution activeExecution;

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // Synchronize on the listener itself: each launch gets its own
        // JobExecution, so locking on jobExecution would not actually
        // guard the shared activeExecution field.
        synchronized (this) {
            if (activeExecution != null && activeExecution.isRunning()) {
                jobExecution.stop();
            } else {
                activeExecution = jobExecution;
            }
        }
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        synchronized (this) {
            if (jobExecution == activeExecution) {
                activeExecution = null;
            }
        }
    }
}
The problem now is that when I launch a second instance of the job, it fails during the bean creation phase with the exception below and never reaches the listener code.

Analyzing the exception, I found that the HSQL database file is locked by the first instance, so when the second instance tries to create its data source it cannot acquire the database file.
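As I understand it, this lock failure is inherent to `jdbc:hsqldb:file:` URLs: HSQLDB's in-process file mode allows only one JVM to open the database at a time. If both processes really must reach the same repository, one option (an assumption on my part, not something I have configured yet) would be to run HSQLDB as a standalone server and point the data source at it; the host, port, and database alias below are placeholders:

```xml
<!-- Sketch only: assumes an external HSQLDB Server process is already
     running on localhost:9001 and serving a database aliased "batchcore".
     The hsql:// (server) URL replaces the file: URL, so a second JVM can
     connect instead of failing on the .lck file. -->
<bean id="repDataSource" class="org.apache.commons.dbcp.BasicDataSource"
      lazy-init="true" destroy-method="close">
    <property name="driverClassName" value="org.hsqldb.jdbcDriver" />
    <property name="url" value="jdbc:hsqldb:hsql://localhost:9001/batchcore" />
</bean>
```

Even with that, the `jdbc:initialize-database` block would drop and recreate the meta-tables on every context start, which would wipe the first instance's running state, so I am not sure this alone solves my actual requirement.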
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.jdbc.datasource.init.DataSourceInitializer#0': Invocation of init method failed; nested exception is org.springframework.dao.DataAccessResourceFailureException: Failed to execute database script; nested exception is org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Database lock acquisition failure: lockFile: org.hsqldb.persist.LockFile@7536ad47[file =D:\DART_Activity\2016\smrs\hsqldb\batchcore.db.lck, exists=true, locked=false, valid=false, ] method: checkHeartbeat read: 2016-06-01 11:11:25 heartbeat - read: -9445 ms.)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1482)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:521)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:458)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:295)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:223)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:292)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:626)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:932)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:479)
at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:139)
at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:93)
at com.metlife.winweb.launcher.TriggetJob.run(TriggetJob.java:32)
at com.metlife.winweb.launcher.TriggetJob.main(TriggetJob.java:21)
Caused by: org.hsqldb.HsqlException: Database lock acquisition failure: lockFile: org.hsqldb.persist.LockFile@7536ad47[file =D:\hsqldb\batchcore.db.lck, exists=true, locked=false, valid=false, ] method: checkHeartbeat read: 2016-06-01 11:11:25 heartbeat - read: -9445 ms.
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.error.Error.error(Unknown Source)
at org.hsqldb.persist.LockFile.newLockFileLock(Unknown Source)
at org.hsqldb.persist.Logger.acquireLock(Unknown Source)
at org.hsqldb.persist.Logger.open(Unknown Source)
at org.hsqldb.Database.reopen(Unknown Source)
at org.hsqldb.Database.open(Unknown Source)
at org.hsqldb.DatabaseManager.getDatabase(Unknown Source)
at org.hsqldb.DatabaseManager.newSession(Unknown Source)
Could someone please help me solve this problem?