I have 10 text files containing XMLs that I want to insert into a database. Each file is about 113 MB. After reading 5-6 files I get a "java.io.IOException: Stream closed" error. Below is my Spring Batch configuration. Can you suggest how I can fix this?
<bean id="gimReader"
class="org.springframework.batch.item.file.MultiResourceItemReader"
scope="step">
<property name="delegate">
<bean class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="lineMapper">
<bean
class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="names" value="gimDataStr" />
</bean>
</property>
<property name="fieldSetMapper">
<bean
class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
<property name="prototypeBeanName" value="gimDataBean" />
</bean>
</property>
</bean>
</property>
</bean>
</property>
<property name="resources" value="file:C:/Vet/XMLtype/SampleData/*.sql" />
</bean>
<bean id="gimDataBean" class="com.jpmorgan.batch.standinginstruction.model.GimDataBean" scope="prototype"/>
<bean id="jdbcWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="itemSqlParameterSourceProvider">
<bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider" />
</property>
<property name="sql" value="INSERT INTO GIMDATA (GIM_DATA) VALUES(:gimDataStr)" />
<property name="dataSource" ref="dataSource" />
</bean>
<batch:job id="gimId">
<batch:step id="import">
<batch:tasklet task-executor="simpleTaskExecutor" throttle-limit="20">
<batch:chunk reader="gimReader" writer="jdbcWriter"
commit-interval="500">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
<bean id="simpleTaskExecutor" class="org.springframework.core.task.SimpleAsyncTaskExecutor">
<property name="concurrencyLimit" value="20" />
</bean>
Am I doing something wrong here? Do I have to use partitioning to insert the data from all the files into the database?
Below is the full error I am getting.
at org.springframework.batch.item.file.FlatFileItemReader.readLine(FlatFileItemReader.java:219)
at org.springframework.batch.item.file.FlatFileItemReader.doRead(FlatFileItemReader.java:172)
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.read(AbstractItemCountingItemStreamItemReader.java:85)
at org.springframework.batch.item.file.MultiResourceItemReader.readNextItem(MultiResourceItemReader.java:119)
at org.springframework.batch.item.file.MultiResourceItemReader.read(MultiResourceItemReader.java:108)
at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:309)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:131)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:119)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
at $Proxy25.read(Unknown Source)
at org.springframework.batch.core.step.item.SimpleChunkProvider.doRead(SimpleChunkProvider.java:90)
at org.springframework.batch.core.step.item.SimpleChunkProvider.read(SimpleChunkProvider.java:148)
at org.springframework.batch.core.step.item.SimpleChunkProvider$1.doInIteration(SimpleChunkProvider.java:108)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:367)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:214)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:143)
at org.springframework.batch.core.step.item.SimpleChunkProvider.provide(SimpleChunkProvider.java:103)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:68)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:386)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:130)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:264)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:76)
at org.springframework.batch.repeat.support.TaskExecutorRepeatTemplate$ExecutingRunnable.run(TaskExecutorRepeatTemplate.java:258)
at org.springframework.core.task.SimpleAsyncTaskExecutor$ConcurrencyThrottlingRunnable.run(SimpleAsyncTaskExecutor.java:229)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.io.IOException: Stream closed
at java.io.BufferedReader.ensureOpen(BufferedReader.java:97)
at java.io.BufferedReader.readLine(BufferedReader.java:292)
Thanks, SACH
Answer 0 (score: 0):
You are using a multi-threaded step with a stateful (1) reader (MultiResourceItemReader). For your use case, use partitioning with a MultiResourcePartitioner instead (a configuration sketch follows the quoted documentation below).
1) From the Spring Batch reference documentation on multi-threaded steps:
There are some practical limitations of using multi-threaded Steps for some common batch use cases. Many participants in a Step (such as readers and writers) are stateful, and if the state is not segregated by thread, then those components are not usable in a multi-threaded Step. In particular most of the off-the-shelf readers and writers from Spring Batch are not designed for multi-threaded use.
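A minimal sketch of what that partitioned configuration could look like, assuming the batch namespace prefix, the jdbcWriter, gimDataBean and simpleTaskExecutor beans, and the resource pattern from your question; the names filePartitioner, partitionedGimReader, masterStep and importFile, as well as the grid-size value, are illustrative and not taken from your configuration:

<!-- One partition per input file; MultiResourcePartitioner exposes each file
     in its partition's step execution context under the key 'fileName'. -->
<bean id="filePartitioner"
      class="org.springframework.batch.core.partition.support.MultiResourcePartitioner">
    <property name="resources" value="file:C:/Vet/XMLtype/SampleData/*.sql" />
</bean>

<!-- Step-scoped reader: every partition (and therefore every thread) gets
     its own FlatFileItemReader bound to exactly one file. -->
<bean id="partitionedGimReader" scope="step"
      class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="resource" value="#{stepExecutionContext['fileName']}" />
    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
            <property name="lineTokenizer">
                <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                    <property name="names" value="gimDataStr" />
                </bean>
            </property>
            <property name="fieldSetMapper">
                <bean class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
                    <property name="prototypeBeanName" value="gimDataBean" />
                </bean>
            </property>
        </bean>
    </property>
</bean>

<batch:job id="gimId">
    <batch:step id="masterStep">
        <batch:partition step="importFile" partitioner="filePartitioner">
            <!-- MultiResourcePartitioner creates one partition per file,
                 so grid-size is only a hint here. -->
            <batch:handler grid-size="10" task-executor="simpleTaskExecutor" />
        </batch:partition>
    </batch:step>
</batch:job>

<!-- Worker step: no task-executor here; concurrency comes from the partition handler. -->
<batch:step id="importFile">
    <batch:tasklet>
        <batch:chunk reader="partitionedGimReader" writer="jdbcWriter" commit-interval="500" />
    </batch:tasklet>
</batch:step>

This way each thread reads only its own file through its own reader instance, so the threads no longer share a single MultiResourceItemReader, and the "Stream closed" error caused by one thread closing the stream another thread is still reading should go away.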