java - Spring batch to upload a CSV file and insert into database accordingly

My project has a requirement where the user uploads a CSV file that has to be pushed to a MySQL database. I know Spring Batch can process large numbers of records, but I'm not able to find any tutorial/sample code for this requirement of mine. All the tutorials I came across hardcode the CSV file name, like the one below:

https://spring.io/guides/gs/batch-processing/

I need to use the file uploaded by the user and process it accordingly. Any help here would be appreciated.

If not with Spring Batch, is there any other way to insert the uploaded CSV data into MySQL?


1 Reply


Please use this as the main reference: http://walkingtechie.blogspot.co.uk/2017/03/spring-batch-csv-file-to-mysql.html It explains how to use Spring Batch to import a CSV file into a MySQL database.

However, as you said, all the examples assume a hardcoded file name, which is not what you want.

In the code below, the important bits (those that differ from the example in the link above) are in the controller, which takes a multipart file and saves it in a temporary folder. The file name is then passed to the job as a parameter:

JobExecution jobExecution = jobLauncher.run(importUserJob, new JobParametersBuilder()
                .addString("fullPathFileName", fileToImport.getAbsolutePath())
                .toJobParameters());

Finally, the importReader uses the fullPathFileName parameter to load the file uploaded by the user. Note that the bean must be @StepScope so that the jobParameters placeholder is resolved when the step actually runs:

      @Bean
      @StepScope // required so the jobParameters placeholder is resolved at step execution time
      public FlatFileItemReader<Person> importReader(@Value("#{jobParameters['fullPathFileName']}") String pathToFile) {
        FlatFileItemReader<Person> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(pathToFile));
        // ...

Here is the full code (not tested, but it has most of the components) to give you an idea:

import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Bean
    public ResourcelessTransactionManager batchTransactionManager() {
        // No-op transaction manager: the batch metadata lives in memory,
        // so nothing needs to participate in transactions
        return new ResourcelessTransactionManager();
    }

    @Bean
    protected JobRepository jobRepository(ResourcelessTransactionManager batchTransactionManager) throws Exception {
        // In-memory job repository, so no Spring Batch metadata tables are needed in MySQL
        MapJobRepositoryFactoryBean jobRepository = new MapJobRepositoryFactoryBean();
        jobRepository.setTransactionManager(batchTransactionManager);
        jobRepository.afterPropertiesSet();
        return (JobRepository) jobRepository.getObject();
    }

    @Bean
    public JobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

}

import javax.sql.DataSource;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

@Configuration
public class ImportJobConfig {

    // Provided by @EnableBatchProcessing in BatchConfig
    @Autowired private JobBuilderFactory jobBuilderFactory;
    @Autowired private StepBuilderFactory stepBuilderFactory;
    @Autowired private DataSource dataSource;

    @Bean
    @StepScope // late-binds the jobParameters placeholder at step execution time
    public FlatFileItemReader<Person> importReader(@Value("#{jobParameters['fullPathFileName']}") String pathToFile) {
        FlatFileItemReader<Person> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(pathToFile));
        reader.setLineMapper(new DefaultLineMapper<Person>() {{
            setLineTokenizer(new DelimitedLineTokenizer() {{
                setNames(new String[]{"firstName", "lastName"});
            }});
            setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
                setTargetType(Person.class);
            }});
        }});
        return reader;
    }

    @Bean
    public PersonItemProcessor processor() {
        return new PersonItemProcessor();
    }

    @Bean
    public JdbcBatchItemWriter<Person> writer() {
        JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<>();
        writer.setItemSqlParameterSourceProvider(
                new BeanPropertyItemSqlParameterSourceProvider<Person>());
        writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
        writer.setDataSource(dataSource);
        return writer;
    }

    // JobCompletionNotificationListener is the one from the Spring guide linked in the question
    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener, Step step1) {
        return jobBuilderFactory.get("importUserJob").incrementer(new RunIdIncrementer())
                .listener(listener).flow(step1).end().build();
    }

    @Bean
    public Step step1(@Qualifier("importReader") FlatFileItemReader<Person> importReader) {
        return stepBuilderFactory.get("step1").<Person, Person>chunk(10).reader(importReader)
                .processor(processor()).writer(writer()).build();
    }

}
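
Two pieces referenced above are not shown: the Person model and the PersonItemProcessor. They are essentially the ones from the Spring guide linked in the question. A minimal sketch (assuming a people table with first_name and last_name columns, to match the writer's INSERT) could look like this:

import org.springframework.batch.item.ItemProcessor;

public class Person {

    private String firstName;
    private String lastName;

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }
    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }
}

public class PersonItemProcessor implements ItemProcessor<Person, Person> {

    @Override
    public Person process(Person person) {
        // Example transformation: trim the incoming values before they are written
        Person transformed = new Person();
        transformed.setFirstName(person.getFirstName().trim());
        transformed.setLastName(person.getLastName().trim());
        return transformed;
    }
}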

import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;

import org.apache.commons.io.IOUtils; // Apache Commons IO
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.ClassPathResource;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class MyImportController {

    @Autowired private JobLauncher jobLauncher;
    @Autowired private Job importUserJob;

    // jobLauncher.run also throws checked JobExecution* exceptions, hence throws Exception
    @RequestMapping(value = "/import/file", method = RequestMethod.POST)
    public String create(@RequestParam("file") MultipartFile multipartFile) throws Exception {

        // Save the multipart file in a temporary physical folder
        // (it's assumed you have a folder called tmpuploads in the resources folder)
        String path = new ClassPathResource("tmpuploads/").getURL().getPath();
        File fileToImport = new File(path + multipartFile.getOriginalFilename());
        try (OutputStream outputStream = new FileOutputStream(fileToImport)) {
            IOUtils.copy(multipartFile.getInputStream(), outputStream);
        }

        // Launch the batch job, passing the full path of the saved file as a job parameter
        JobExecution jobExecution = jobLauncher.run(importUserJob, new JobParametersBuilder()
                .addString("fullPathFileName", fileToImport.getAbsolutePath())
                .toJobParameters());

        return "OK";
    }

}
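
You can then exercise the endpoint with something like curl -F "file=@people.csv" http://localhost:8080/import/file.

As for the last part of your question: if you don't strictly need Spring Batch (i.e. the files are small and you don't care about chunking, restartability or skip/retry handling), you can insert the uploaded CSV directly with Spring's JdbcTemplate. A rough sketch (hypothetical endpoint, naive comma splitting with no support for quoted fields):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class SimpleCsvImportController {

    @Autowired private JdbcTemplate jdbcTemplate;

    @RequestMapping(value = "/import/simple", method = RequestMethod.POST)
    public String create(@RequestParam("file") MultipartFile multipartFile) throws IOException {
        // Read the uploaded CSV line by line; assumes two comma-separated columns
        List<Object[]> rows = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(multipartFile.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] fields = line.split(",");
                rows.add(new Object[]{fields[0].trim(), fields[1].trim()});
            }
        }

        // Single batched round trip to MySQL
        jdbcTemplate.batchUpdate(
                "INSERT INTO people (first_name, last_name) VALUES (?, ?)", rows);
        return "OK";
    }
}

The trade-off is that you lose everything Spring Batch gives you for free (chunked commits, restartability, skip/retry), so for large files the Batch setup above remains the better fit.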
