Spring Batch: upload a CSV file and insert it into the database accordingly

Date: 2017-11-22 19:53:28

Tags: java spring spring-batch

My project has a requirement where the user uploads a CSV file that must be pushed to a MySQL database. I know Spring Batch can process large numbers of records, but I could not find any tutorial/sample code for this particular requirement. All the tutorials I came across hard-code the CSV file name, like this one:

https://spring.io/guides/gs/batch-processing/

I need to use the file uploaded by the user and process it accordingly. Any help here would be greatly appreciated.

If Spring Batch is not used, is there any other way to insert the uploaded CSV data into MySQL?

2 answers:

Answer 0 (score: 3)

Please take this as the main reference: http://walkingtechie.blogspot.co.uk/2017/03/spring-batch-csv-file-to-mysql.html It explains how to import a CSV file into a MySQL database using Spring Batch.

However, as you said, all the examples assume a hard-coded file, which is not what you want.

In the code below, the important part (which differs from the example in the link above) is the controller: it takes the multipart file and saves it in a temporary folder. The file path is then passed to the Job as a parameter:

JobExecution jobExecution = jobLauncher.run(importUserJob, new JobParametersBuilder()
                .addString("fullPathFileName", fileToImport.getAbsolutePath())
                .toJobParameters());

Finally, importReader uses the fullPathFileName parameter to load the file uploaded by the user:

      @Bean
      @StepScope // step scope so the jobParameters expression is resolved when the step runs
      public FlatFileItemReader<Person> importReader(@Value("#{jobParameters['fullPathFileName']}") String pathToFile) {
        FlatFileItemReader<Person> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(pathToFile));

A complete code example is given below (not tested, but it contains most of the components):

@Configuration
@EnableBatchProcessing
public class BatchConfig{

    @Bean
    public ResourcelessTransactionManager batchTransactionManager(){
        ResourcelessTransactionManager transactionManager = new ResourcelessTransactionManager();
        return transactionManager;
    }

    @Bean
    protected JobRepository jobRepository(ResourcelessTransactionManager batchTransactionManager) throws Exception {
        MapJobRepositoryFactoryBean jobRepository = new MapJobRepositoryFactoryBean();
        jobRepository.setTransactionManager(batchTransactionManager);
        jobRepository.afterPropertiesSet(); // initialize the factory before obtaining the repository
        return (JobRepository) jobRepository.getObject();
    }

    @Bean
    public JobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

}

@Configuration
public class ImportJobConfig {

    @Autowired private JobBuilderFactory jobBuilderFactory;
    @Autowired private StepBuilderFactory stepBuilderFactory;
    @Autowired private DataSource dataSource;

    @Bean
    @StepScope // step scope so the jobParameters expression is resolved when the step runs
    public FlatFileItemReader<Person> importReader(@Value("#{jobParameters['fullPathFileName']}") String pathToFile) {
        FlatFileItemReader<Person> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(pathToFile));
        reader.setLineMapper(new DefaultLineMapper<Person>() {{
            setLineTokenizer(new DelimitedLineTokenizer() {{
                setNames(new String[]{"firstName", "lastName"});
            }});
            setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
                setTargetType(Person.class);
            }});
        }});
        return reader;
    }

    @Bean
    public PersonItemProcessor processor() {
        return new PersonItemProcessor();
    }

    @Bean
    public JdbcBatchItemWriter<Person> writer() {
        JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<>();
        writer.setItemSqlParameterSourceProvider(
                new BeanPropertyItemSqlParameterSourceProvider<Person>());
        writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
        writer.setDataSource(dataSource);
        return writer;
    }

    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener, Step step1) {
        return jobBuilderFactory.get("importUserJob").incrementer(new RunIdIncrementer())
                .listener(listener).flow(step1).end().build();
    }

    @Bean
    public Step step1(@Qualifier("importReader") ItemReader<Person> importReader) {
        return stepBuilderFactory.get("step1").<Person, Person>chunk(10).reader(importReader)
                .processor(processor()).writer(writer()).build();
    }

}

@RestController
public class MyImportController {

    @Autowired private JobLauncher jobLauncher;
    @Autowired private Job importUserJob;

    @RequestMapping(value = "/import/file", method = RequestMethod.POST)
    public String create(@RequestParam("file") MultipartFile multipartFile) throws Exception {

        // Save the multipart file in a temporary physical folder
        // (it's assumed you have a folder called tmpuploads in the resources folder)
        String path = new ClassPathResource("tmpuploads/").getURL().getPath();
        File fileToImport = new File(path + multipartFile.getOriginalFilename());
        try (OutputStream outputStream = new FileOutputStream(fileToImport)) {
            IOUtils.copy(multipartFile.getInputStream(), outputStream);
            outputStream.flush();
        }

        //Launch the Batch Job
        JobExecution jobExecution = jobLauncher.run(importUserJob, new JobParametersBuilder()
                .addString("fullPathFileName", fileToImport.getAbsolutePath())
                .toJobParameters());        

        return "OK";
    }

}
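Note that the code above references a JobCompletionNotificationListener (as well as the Person and PersonItemProcessor classes from the linked Spring guide) without defining it. A minimal listener sketch, assuming you only want to log completion (this class is not part of the original answer; the guide's version checks the inserted rows with a JdbcTemplate):

import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.listener.JobExecutionListenerSupport;
import org.springframework.stereotype.Component;

@Component
public class JobCompletionNotificationListener extends JobExecutionListenerSupport {

    @Override
    public void afterJob(JobExecution jobExecution) {
        // Runs once the whole job has finished; a real listener could verify the imported rows here.
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            System.out.println("CSV import completed successfully");
        }
    }
}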

Answer 1 (score: 0)

I did this with a mix of Spring MVC (a RestController) and Spring Batch. Spring MVC handles uploading the CSV file as a multipart request. I then invoke Spring Batch asynchronously, passing the uploaded CSV to the Spring Batch job. Once the job receives the CSV file, it does the batch processing by reading, processing, and writing the records to the DB.
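A minimal sketch of that wiring, assuming hypothetical names such as asyncJobLauncher, importCsvJob and /import/csv (none of which come from the original answer): a JobLauncher backed by an asynchronous task executor lets the controller return immediately while the job runs in the background.

import java.io.File;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@Configuration
public class AsyncLauncherConfig {

    // JobLauncher backed by an async executor: run(...) returns immediately
    // and the job executes on a background thread.
    @Bean
    public JobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.setTaskExecutor(new SimpleAsyncTaskExecutor());
        launcher.afterPropertiesSet();
        return launcher;
    }
}

@RestController
public class AsyncImportController {

    @Autowired private JobLauncher asyncJobLauncher;
    @Autowired private Job importCsvJob; // the job that reads, processes and writes to the DB

    @PostMapping("/import/csv")
    public String importCsv(@RequestParam("file") MultipartFile file) throws Exception {
        // Save the upload to a temporary file and hand its path to the job.
        File csv = File.createTempFile("upload-", ".csv");
        file.transferTo(csv);

        asyncJobLauncher.run(importCsvJob, new JobParametersBuilder()
                .addString("fullPathFileName", csv.getAbsolutePath())
                .addLong("timestamp", System.currentTimeMillis()) // keep each run's parameters unique
                .toJobParameters());

        return "Import started";
    }
}

The timestamp parameter just keeps each upload's JobParameters unique, so the same file name can be imported more than once.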