I am trying to do integration testing with spring-boot and Spring embedded Kafka. I can produce messages to the embedded Kafka broker, but the listener in the service class, while trying to consume the records, fails with the error below.

KafkaProducerConfigTest, the configuration class that contains all the beans:
@EnableKafka
@TestConfiguration
public class KafkaProducerConfigTest {

    @Bean
    public EmbeddedKafkaBroker embeddedKafkaBroker() {
        return new EmbeddedKafkaBroker(1, false, 2, "test-events");
    }

    @Bean
    public ProducerFactory<String, Object> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafkaBroker().getBrokersAsString());
        props.put(ProducerConfig.RETRIES_CONFIG, 0);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean("consumerFactory")
    public ConsumerFactory<String, Object> createConsumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafkaBroker().getBrokersAsString());
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(Object.class, false));
    }

    @Bean("kafkaListenerContainerFactory")
    public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(createConsumerFactory());
        factory.setBatchListener(true);
        factory.setMessageConverter(new BatchMessagingMessageConverter(converter()));
        factory.getContainerProperties().setAckMode(AckMode.BATCH);
        return factory;
    }

    @Bean
    public StringJsonMessageConverter converter() {
        return new StringJsonMessageConverter();
    }

    @Bean
    public Listener listener() {
        return new Listener();
    }

    public class Listener {

        public final CountDownLatch latch = new CountDownLatch(1);

        @Getter
        public List<Professor> list;

        @KafkaListener(topics = "test-events", containerFactory = "kafkaListenerContainerFactory")
        public void listen1(List<Professor> foo) {
            list = foo;
            this.latch.countDown();
        }
    }
}
KafkaProducerServiceTest, the test class for the service. The listener here does not consume any data; I know this test will fail, because this is not the right approach:
@EnableKafka
@SpringBootTest(classes = { KafkaProducerConfigTest.class })
@RunWith(SpringRunner.class)
public class KafkaProducerServiceTest {

    @Autowired
    private KafkaConsumerService kafkaConsumerService;

    @Autowired
    private Listener listener;

    @Test
    public void testReceive() throws Exception {
        kafkaConsumerService.professor(Arrays.asList(new Professor("Ajay", new Department("social", 1234))));
        System.out.println("The professor object is sent to kafka -----------------------------------");
        listener.latch.await();
        System.out.println(listener.getList());
    }
}
The error:
2019-02-19 15:18:32.620 ERROR 22387 --- [ntainer#1-0-C-1] o.s.k.listener.BatchLoggingErrorHandler : Error while processing:
ConsumerRecord(topic = test-events, partition = 1, offset = 0, CreateTime = 1550611112583, serialized key size = 9, serialized value size = 64, headers = RecordHeaders(headers = [], isReadOnly = false), key = professor, value = {name=Ajay, department={deptName=social, deptId=1234}})
java.lang.IllegalStateException: Only String or byte[] supported
at org.springframework.kafka.support.converter.StringJsonMessageConverter.extractAndConvertValue(StringJsonMessageConverter.java:140) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.support.converter.MessagingMessageConverter.toMessage(MessagingMessageConverter.java:134) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.support.converter.BatchMessagingMessageConverter.convert(BatchMessagingMessageConverter.java:217) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.support.converter.BatchMessagingMessageConverter.toMessage(BatchMessagingMessageConverter.java:165) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.listener.adapter.BatchMessagingMessageListenerAdapter.toMessagingMessage(BatchMessagingMessageListenerAdapter.java:174) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.listener.adapter.BatchMessagingMessageListenerAdapter.onMessage(BatchMessagingMessageListenerAdapter.java:129) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.listener.adapter.BatchMessagingMessageListenerAdapter.onMessage(BatchMessagingMessageListenerAdapter.java:59) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeBatchListener(KafkaMessageListenerContainer.java:984) [spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeBatchListener(KafkaMessageListenerContainer.java:917) [spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:900) [spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:753) [spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_191]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_191]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_191]
Updated according to @Gary Russell's answer, but I still have the problem.
Answer 0 (score: 1)
Use a StringDeserializer and a StringJsonMessageConverter (or a BytesDeserializer and a BytesJsonMessageConverter) in the container factory; the framework can then determine the target type from the method signature and pass it to the converter.

The deserializer runs too far down the stack for the framework to perform that inference there.
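A minimal sketch of that suggestion applied to the question's configuration (bean names, the "test-events" topic, and the Professor type are taken from the question; treat this as an illustrative sketch, not a verified drop-in fix):

```java
// The value is now deserialized as a plain String; the JSON -> Professor
// conversion happens in the message converter, which can see the
// List<Professor> parameter of the @KafkaListener method.
@Bean("consumerFactory")
public ConsumerFactory<String, String> createConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafkaBroker().getBrokersAsString());
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
    props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
    // Key change: StringDeserializer for the value instead of JsonDeserializer.
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new StringDeserializer());
}

@Bean("kafkaListenerContainerFactory")
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(createConsumerFactory());
    factory.setBatchListener(true);
    // StringJsonMessageConverter receives the String payload and the target
    // type inferred from the listener method signature.
    factory.setMessageConverter(new BatchMessagingMessageConverter(new StringJsonMessageConverter()));
    factory.getContainerProperties().setAckMode(AckMode.BATCH);
    return factory;
}
```

With this arrangement the `IllegalStateException: Only String or byte[] supported` goes away, because the converter now receives a String value rather than the Map produced by JsonDeserializer.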
Answer 1 (score: 0)
It is probably too late to reply to this discussion, but as @Gary Russell mentioned in his answer, just use a StringDeserializer to deserialize the object.

application.yml to configure Kafka:
server:
  port: 8080
  servlet:
    contextPath: /poi

spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      group-id: rsvp-consumers
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      properties:
        spring:
          json:
            type:
              mapping: poi:org.ajeet.learnings.spring.springboot.model.POI
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      properties:
        spring:
          json:
            type:
              mapping: poi:org.ajeet.learnings.spring.springboot.model.POI
    topics: spatial
The Kafka event class:
public final class POI {

    private final double longitude;
    private final double latitude;
    private final POI_Type type;
    private final String description;

    @JsonCreator
    public POI(@JsonProperty("longitude") double longitude,
               @JsonProperty("latitude") double latitude,
               @JsonProperty("type") POI_Type type,
               @JsonProperty("description") String description) {
        this.longitude = longitude;
        this.latitude = latitude;
        this.type = type;
        this.description = description;
    }

    public double getLongitude() {
        return longitude;
    }

    public double getLatitude() {
        return latitude;
    }

    public String getDescription() {
        return description;
    }
}
Register a StringJsonMessageConverter bean in your application class (you can also create the bean elsewhere); it converts the String into the required object, i.e. POI:
@SpringBootApplication
@ComponentScan("org.ajeet.learnings.spring")
public class SpringBootKafkaApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBootKafkaApplication.class, args);
    }

    /**
     * We are using a StringDeserializer in the Kafka consumer properties,
     * and this converter converts the message from a String to the required type.
     *
     * @return RecordMessageConverter
     */
    @Bean
    public RecordMessageConverter converter() {
        return new StringJsonMessageConverter();
    }
}
Here is the consumer:
@Service
public final class POIEventConsumer {

    private final Logger LOG = LoggerFactory.getLogger(POIEventConsumer.class);

    /**
     * Reads the topic names from application.yml.
     *
     * @param pointOfInterest
     * @throws IOException
     */
    @KafkaListener(topics = "#{'${spring.kafka.topics}'.split(',')}")
    public void consume(POI pointOfInterest) throws IOException {
        LOG.info(pointOfInterest.toString());
    }
}
Here is the producer:
@Service
public class POIEventProducer {

    private static final Logger LOG = LoggerFactory.getLogger(POIEventProducer.class);

    @Value("${spring.kafka.topics}")
    private String topic;

    @Autowired
    private KafkaTemplate<String, POI> kafkaTemplate;

    public void sendMessage(POI pointOfInterest) {
        this.kafkaTemplate.send(topic, pointOfInterest);
    }
}
You can test it with a controller:
@RestController
@RequestMapping(value = "/kafka")
public final class POIEventController {

    private final Logger LOG = LoggerFactory.getLogger(POIEventController.class);

    @Autowired
    private POIEventProducer poiEventProducer;

    /**
     * Post request url: http://localhost:8080/poi/kafka/publish
     * Example of request body: {"longitude":77.100281, "latitude": 28.556160, "type": "Airport", "description": "Indira Gandhi International Airport New Delhi"}
     *
     * @param pointOfInterest
     */
    @RequestMapping(value = "/publish",
            method = RequestMethod.POST,
            consumes = MediaType.APPLICATION_JSON_VALUE)
    public void sendMessageToKafkaTopic(@RequestBody POI pointOfInterest) {
        this.poiEventProducer.sendMessage(pointOfInterest);
    }
}