Spring Boot batch: read a file and send the data to Kafka

Time: 2017-07-10 13:07:06

Tags: spring-boot spring-batch spring-kafka

I am unable to send data from a CSV file to Kafka. This is the code of the Writer.java I use for the batch job:

import java.util.List;

import org.springframework.batch.item.ItemWriter;

import com.codenotfound.kafka.Car;
import com.codenotfound.kafka.repository.*;

public class Writer implements ItemWriter<Car> {

    private final Repository repo;

    public Writer(Repository repo) {
        this.repo = repo;
    }

    @Override
    public void write(List<? extends Car> cars) throws Exception {
        // currently saves each chunk to the database; this is the call
        // I want to replace with a send to Kafka
        repo.save(cars);
    }
}

So instead of repo.save(cars), I want to send these Car details to Kafka. Here are my Car class and Repository interface:

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "Car")
public class Car {

  private String make;

  private String manufacturer;
  @Id
  @GeneratedValue(strategy = GenerationType.AUTO)
  private long id;

  public Car() {
    //super();
  }

  public Car(String make, String manufacturer) {
    super();
    this.make = make;
    this.manufacturer = manufacturer;
  }

  public String getMake() {
    return make;
  }

  public void setMake(String make) {
    this.make = make;
  }

  public String getManufacturer() {
    return manufacturer;
  }


  public void setManufacturer(String manufacturer) {
    this.manufacturer = manufacturer;
  }

  public long getId() {
    return id;
  }


  public void setId(long id) {
    this.id = id;
  }

  @Override
  public String toString() {
    return "Car [make=" + make + ", manufacturer=" + manufacturer + ", id=" + id + "]";
  }
}   

And the Repository interface:

public interface Repository extends CrudRepository<Car, Long>, CustomRepository {
}

My Sender class for Kafka is:

package com.codenotfound.kafka.producer;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;

import com.codenotfound.kafka.Car;

public class Sender {

  private static final Logger LOGGER = LoggerFactory.getLogger(Sender.class);

  @Value("${topic.json}")
  private String jsonTopic;

  @Autowired
  private KafkaTemplate<String, Car> kafkaTemplate;

  public void send(Car car) {
    LOGGER.info("sending car='{}'", car.toString());
    kafkaTemplate.send(jsonTopic, car);
  }
}

Please suggest a way to send the data from the CSV file to my Kafka topic.

1 Answer:

Answer 0 (score: 0)

It looks like you already have all the pieces of the puzzle. What you need to do is change your ItemWriter to use your Sender (or the KafkaTemplate directly), so that you end up with something like this:

import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

import com.codenotfound.kafka.Car;

@Component
public class Writer implements ItemWriter<Car> {

    @Value("${topic.json}")
    private String jsonTopic;

    @Autowired
    private KafkaTemplate<String, Car> kafkaTemplate;

    @Override
    public void write(List<? extends Car> cars) throws Exception {
        cars.forEach(car -> kafkaTemplate.send(jsonTopic, car));
    }
}
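
Note that the KafkaTemplate<String, Car> autowired above has to exist as a bean that serializes the Car payload to JSON. If your project does not already define one (the codenotfound samples usually do), a minimal sketch could look like the class below; the class name and the localhost:9092 bootstrap address are placeholders you would adapt to your setup:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

import com.codenotfound.kafka.Car;

@Configuration
public class KafkaProducerConfig {

  @Bean
  public ProducerFactory<String, Car> producerFactory() {
    Map<String, Object> props = new HashMap<>();
    // broker address is a placeholder; point it at your Kafka cluster
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    // serialize the Car value as JSON so a consumer can map it back to an object
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return new DefaultKafkaProducerFactory<>(props);
  }

  @Bean
  public KafkaTemplate<String, Car> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
  }
}

The topic name still comes from the topic.json property, so it also needs to be set in application.properties.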

In your job configuration you then need to autowire the Writer like this (the code below is simplified; it only shows how to declare the writer in a step):

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.codenotfound.kafka.Car;

@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    private Writer carWriter;

    @Bean
    public Step myStep() {
        // build the step: read Cars in chunks of 10 and hand them to the Kafka-backed writer
        return this.stepBuilderFactory.get("myStep")
            .<Car, Car>chunk(10)
            .reader(reader())   // the CSV reader bean, see the sketch after this class
            .writer(carWriter)  // don't use new here; let Spring inject the writer
            .build();
    }
}
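
The reader() referenced in myStep() is not shown in this answer. Since the question is about reading Cars from a CSV file, here is a rough sketch of a FlatFileItemReader bean that could be added to the same BatchConfiguration class; the cars.csv file name, the skipped header line, and the column names are assumptions that have to match your actual file:

// additional imports needed in BatchConfiguration for this bean
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.core.io.ClassPathResource;

@Bean
public FlatFileItemReader<Car> reader() {
    FlatFileItemReader<Car> reader = new FlatFileItemReader<>();
    // file name is an assumption; point it at your actual CSV resource
    reader.setResource(new ClassPathResource("cars.csv"));
    // skip the header row if your CSV has one
    reader.setLinesToSkip(1);

    // column names are assumptions; they must match the Car properties
    DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
    tokenizer.setNames(new String[] {"make", "manufacturer"});

    // map each tokenized line onto a new Car instance
    BeanWrapperFieldSetMapper<Car> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
    fieldSetMapper.setTargetType(Car.class);

    DefaultLineMapper<Car> lineMapper = new DefaultLineMapper<>();
    lineMapper.setLineTokenizer(tokenizer);
    lineMapper.setFieldSetMapper(fieldSetMapper);
    reader.setLineMapper(lineMapper);

    return reader;
}

To actually run the step it still has to be wired into a Job (for example via JobBuilderFactory), which the simplified configuration above leaves out.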