Best way to pass a 250+ character String as a jobParameter in Spring Batch 3.0

Asked: 2017-05-01 16:13:37

Tags: spring spring-batch

My REST controller receives a list of ids and sends the JSON string of that List as a jobParameter to a Spring Batch job.

@Autowired
JobLauncher jobLauncher;

@Autowired
Job job;

@RequestMapping(value="/startjob", method = RequestMethod.POST, produces = "application/json")
public @ResponseBody List<EventReports> addReportIds(@RequestBody List<Integer> reportIds) throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
    Logger logger = LoggerFactory.getLogger(this.getClass());
    try {
        JobParameters jobParameters = new JobParametersBuilder().addLong("time", System.currentTimeMillis())
            .addString("eventType", "Event Reports")
            .addString("reportIdsJson", reportIds.toString())
            .toJobParameters();
        jobLauncher.run(job, jobParameters);
    } catch (Exception e) {
        logger.info(e.getMessage());
    }
    System.out.println("Completed event reports batch job");
    return null;
}
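For reference, the JSON string could also be produced explicitly with Jackson instead of reportIds.toString(). A minimal sketch (assuming Jackson is on the classpath, as it usually is in a Spring Boot REST app; the helper class name is purely illustrative):

// Sketch only: explicit JSON serialization of the id list with Jackson.
// Assumes com.fasterxml.jackson.databind is available; ReportIdsJson is an illustrative name.
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class ReportIdsJson {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Turns a list such as [101, 102, 103] into the JSON array string "[101,102,103]"
    public static String toJson(List<Integer> reportIds) throws JsonProcessingException {
        return MAPPER.writeValueAsString(reportIds);
    }
}

Either way, a long enough id list produces a string that exceeds the 250-character limit described below.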

My Spring Batch Reader looks like

@Component
@StepScope
public class Reader implements ItemReader<String> {

    private String[] messages = {"Hello World!", "Welcome to Spring Batch!"};

    @Value("#{jobParameters['time']}")
    private Long time;

    @Value("#{jobParameters['eventType']}")
    private String eventType;

    @Value("#{jobParameters['reportIdsJson']}")
    private String reportIdsJson;

    private int count=0;

    Logger logger = LoggerFactory.getLogger(this.getClass());

    @Override
    public String read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        System.out.println("Time: " + time);
        System.out.println("Type: " + eventType);
        System.out.println("JSON: " + reportIdsJson);

        if (count < messages.length) {
            return messages[count++];
        } else {
            count = 0;
        }
        return null;
    }
}

The problem I'm running into is that the JSON string I pass as a jobParameter can be very large, and I definitely exceed the 250-character limit. When the JSON string I pass to my Reader as a jobParameter is longer than 250 characters, I get an error that looks like this:

Started event reports batch job
2017-05-01 10:50:03.662  INFO 8724 --- [nio-8081-exec-3] o.s.b.f.xml.XmlBeanDefinitionReader      : Loading XML bean definitions from class path resource [org/springframework/jdbc/support/sql-error-codes.xml]
2017-05-01 10:50:03.788  INFO 8724 --- [nio-8081-exec-3] o.s.jdbc.support.SQLErrorCodesFactory    : SQLErrorCodes loaded: [DB2, Derby, H2, HSQL, Informix, MS-SQL, MySQL, Oracle, PostgreSQL, Sybase, Hana]
2017-05-01 10:50:04.296  INFO 8724 --- [nio-8081-exec-3] c.u.r.s.RailAgentCollectorServiceImpl    : PreparedStatementCallback; SQL [INSERT into BATCH_JOB_EXECUTION_PARAMS(JOB_EXECUTION_ID, KEY_NAME, TYPE_CD, STRING_VAL, DATE_VAL, LONG_VAL, DOUBLE_VAL, IDENTIFYING) values (?, ?, ?, ?, ?, ?, ?, ?)]; String or binary data would be truncated.; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: String or binary data would be truncated.
Completed event reports batch job

I've looked into various ways to handle this. What I'm trying now is: instead of passing the very large JSON string to Spring Batch as a jobParameter, I save it as a String to a temporary database table, and then use the other jobParameters passed to Spring Batch to query that temp table for the very large JSON string. So before starting my Spring Batch job, I have to save the JSON string to a temp db table. This solution doesn't feel very "clean" to me. Ideally, I would just hand my large JSON string to my batch job and start processing immediately. Instead, before starting the Spring Batch job, I first have to save the very large JSON string to a temp database table, so that part of the processing lives outside Spring Batch. My code for this scenario looks like

//save list of Integers JSON to temp db table here
 try {
        JobParameters jobParameters = new JobParametersBuilder().addLong("time", System.currentTimeMillis())
            .addString("eventType", "Event Reports")
            .toJobParameters();
        jobLauncher.run(job, jobParameters);
    } catch (Exception e) {
        logger.info(e.getMessage());
    }
    System.out.println("Completed event reports batch job");
    return null;
}

for my REST controller, and my Reader looks like

@Override
public String read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
    System.out.println("Time: " + time);
    System.out.println("Type: " + eventType);

    //use above 2 job parameters to query temp db table for large JSON string
    //pass large JSON string to my Spring Batch Processor
    return null;
}
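A rough sketch of how that lookup could work, assuming a hypothetical temp table named EVENT_REPORT_REQUEST keyed by the job's time parameter and queried through Spring's JdbcTemplate (the table and column names here are illustrative, not an existing schema):

// Sketch only: reading the large JSON back from a hypothetical temp table.
// EVENT_REPORT_REQUEST, REPORT_IDS_JSON and JOB_TIME are illustrative names.
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;

@Component
@StepScope
public class TempTableReader implements ItemReader<String> {

    @Value("#{jobParameters['time']}")
    private Long time;

    @Autowired
    private JdbcTemplate jdbcTemplate;

    private boolean consumed = false;

    @Override
    public String read() throws Exception {
        if (consumed) {
            return null; // end of input: the single JSON payload has already been returned
        }
        consumed = true;
        // Fetch the JSON that the controller saved just before launching the job.
        return jdbcTemplate.queryForObject(
                "SELECT REPORT_IDS_JSON FROM EVENT_REPORT_REQUEST WHERE JOB_TIME = ?",
                new Object[]{time}, String.class);
    }
}

The controller would have to insert the JSON into that table under the same time key right before calling jobLauncher.run(job, jobParameters), which is exactly the extra step outside Spring Batch that feels less clean to me.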

Is there a better way to design this? Thanks for any suggestions.

1 Answer:

Answer 0 (score: 0)

I took @MichaelMinella's suggestion and raised the 250 varchar limit on the batch job execution parameters. The way I did it:

  • I edited the schema-sqlserver.sql script that I use to initialize spring.batch.schema so that STRING_VAL is VARCHAR(MAX) instead of VARCHAR(250):

    CREATE TABLE BATCH_JOB_EXECUTION_PARAMS (
        JOB_EXECUTION_ID BIGINT NOT NULL,
        TYPE_CD VARCHAR(6) NOT NULL,
        KEY_NAME VARCHAR(100) NOT NULL,
        STRING_VAL VARCHAR(MAX) NULL,
        DATE_VAL DATETIME DEFAULT NULL,
        LONG_VAL BIGINT NULL,
        DOUBLE_VAL DOUBLE PRECISION NULL,
        IDENTIFYING CHAR(1) NOT NULL,
        CONSTRAINT JOB_EXEC_PARAMS_FK FOREIGN KEY (JOB_EXECUTION_ID)
        REFERENCES BATCH_JOB_EXECUTION(JOB_EXECUTION_ID)
    );

Then I pointed application.properties at the hand-edited schema-sqlserver.sql file:

spring.batch.schema=classpath:BOOT-INF/classes/sql/schema-sqlserver.sql

I am now able to pass 250+ character Strings as jobParameters to my Spring Batch job.