Message producer using the Kafka binder of Spring Cloud Stream:
@Component
public static class PageViewEventSource implements ApplicationRunner {

    private final MessageChannel pageViewsOut;
    private final Log log = LogFactory.getLog(getClass());

    public PageViewEventSource(AnalyticsBinding binding) {
        this.pageViewsOut = binding.pageViewsOut();
    }

    @Override
    public void run(ApplicationArguments args) throws Exception {
        List<String> names = Arrays.asList("priya", "dyser", "Ray", "Mark", "Oman", "Larry");
        List<String> pages = Arrays.asList("blog", "facebook", "instagram", "news", "youtube", "about");
        Runnable runnable = () -> {
            String rPage = pages.get(new Random().nextInt(pages.size()));
            // note: was pages.get(...), which picked a page instead of a name
            String rName = names.get(new Random().nextInt(names.size()));
            PageViewEvent pageViewEvent = new PageViewEvent(rName, rPage, Math.random() > .5 ? 10 : 1000);
            Serializer<PageViewEvent> serializer = new JsonSerde<>(PageViewEvent.class).serializer();
            byte[] m = serializer.serialize(null, pageViewEvent);
            Message<byte[]> message = MessageBuilder.withPayload(m).build();
            try {
                this.pageViewsOut.send(message);
                log.info("sent " + message);
            } catch (Exception e) {
                log.error(e);
            }
        };
        Executors.newScheduledThreadPool(1).scheduleAtFixedRate(runnable, 1, 1, TimeUnit.SECONDS);
    }
}
This uses the following serde configuration:

spring.cloud.stream.kafka.streams.binder.configuration.default.key.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
spring.cloud.stream.kafka.streams.binder.configuration.default.value.serde=org.apache.kafka.common.serialization.Serdes$BytesSerde
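As a side note (this is an assumption on my part, not something the question states): instead of serializing by hand with `JsonSerde`, the binder can usually be asked to do the JSON conversion itself by setting the binding's content type. The binding name `pageViewsOut` below is taken from the code above and may differ from the actual channel name declared in `AnalyticsBinding`:

```properties
# Let Spring Cloud Stream convert the PageViewEvent payload to JSON
# (binding name "pageViewsOut" is assumed from the code above)
spring.cloud.stream.bindings.pageViewsOut.content-type=application/json
```

With this in place, the producer could send `Message<PageViewEvent>` directly and drop the manual `Serializer` call.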
I am trying to consume these messages in a separate consumer application via Spring Kafka's @KafkaListener:
@Service
public class PriceEventConsumer {

    private static final Logger LOG = LoggerFactory.getLogger(PriceEventConsumer.class);

    @KafkaListener(topics = "test1", groupId = "json", containerFactory = "kafkaListenerContainerFactory")
    public void receive(Bytes data) {
        //public void receive(@Payload PageViewEvent data, @Headers MessageHeaders headers) {
        LOG.info("Message received");
        LOG.info("received data='{}'", data);
    }
}
Container factory configuration:
@Bean
public ConsumerFactory<String, Bytes> consumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, BytesDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(props);
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, Bytes> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Bytes> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}
With this configuration, the consumer does not receive any messages (Bytes). If I change the Kafka listener to accept String, I get the following exception instead:
@KafkaListener(topics = "test1", groupId = "json", containerFactory = "kafkaListenerContainerFactory")
public void receive(String data) {
    LOG.info("Message received");
    LOG.info("received data='{}'", data);
}
Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [org.apache.kafka.common.utils.Bytes] to [java.lang.String] for GenericMessage [payload={"userId":"facebook","page":"about","duration":10}, headers={kafka_offset=4213, kafka_consumer=brave.kafka.clients.TracingConsumer@9a75f94, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=test1, kafka_receivedTimestamp=1553007593670}], failedMessage=GenericMessage [payload={"userId":"facebook","page":"about","duration":10}, headers={kafka_offset=4213, kafka_consumer=brave.kafka.clients.TracingConsumer@9a75f94, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=test1, kafka_receivedTimestamp=1553007593670}]
... 23 more
Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot convert from [org.apache.kafka.common.utils.Bytes] to [java.lang.String] for GenericMessage [payload={"userId":"facebook","page":"about","duration":10}, headers={kafka_offset=4213, kafka_consumer=brave.kafka.clients.TracingConsumer@9a75f94, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=test1, kafka_receivedTimestamp=1553007593670}]
	at org.springframework.messaging.handler.annotation.support.PayloadArgumentResolver.resolveArgument(PayloadArgumentResolver.java:144) ~[spring-messaging-5.1.4.RELEASE.jar:5.1.4.RELEASE]
	at org.springframework.messaging.handler.invocation.HandlerMethodArgumentResolverComposite.resolveArgument(HandlerMethodArgumentResolverComposite.java:117) ~[spring-messaging-5.1.4.RELEASE.jar:5.1.4.RELEASE]
	at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java:147) ~[spring-messaging-5.1.4.RELEASE.jar:5.1.4.RELEASE]
	at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:116) ~[spring-messaging-5.1.4.RELEASE.jar:5.1.4.RELEASE]
	at org.springframework.kafka.listener.adapter.HandlerAdapter.invoke(HandlerAdapter.java:48) ~[spring-kafka-2.2.3.RELEASE.jar:2.2.3.RELEASE]
	at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:283) ~[spring-kafka-2.2.3.RELEASE.jar:2.2.3.RELEASE]
	... 22 more
Any pointers would be very helpful.
Update - POJO part:
@KafkaListener(topics = "test1", groupId = "json", containerFactory = "kafkaListenerContainerFactory")
public void receive(@Payload PageViewEvent data, @Headers MessageHeaders headers) {
    LOG.info("Message received");
    LOG.info("received data='{}'", data);
}
Container factory configuration:
@Bean
public ConsumerFactory<String, PageViewEvent> priceEventConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(PageViewEvent.class));
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> priceEventsKafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(priceEventConsumerFactory());
    return factory;
}
Producer:
@Override
public void run(ApplicationArguments args) throws Exception {
    List<String> names = Arrays.asList("priya", "dyser", "Ray", "Mark", "Oman", "Larry");
    List<String> pages = Arrays.asList("blog", "facebook", "instagram", "news", "youtube", "about");
    Runnable runnable = () -> {
        String rPage = pages.get(new Random().nextInt(pages.size()));
        // note: was pages.get(...), which picked a page instead of a name
        String rName = names.get(new Random().nextInt(names.size()));
        PageViewEvent pageViewEvent = new PageViewEvent(rName, rPage, Math.random() > .5 ? 10 : 1000);
        Message<PageViewEvent> message = MessageBuilder.withPayload(pageViewEvent).build();
        try {
            this.pageViewsOut.send(message);
            log.info("sent " + message);
        } catch (Exception e) {
            log.error(e);
        }
    };
    Executors.newScheduledThreadPool(1).scheduleAtFixedRate(runnable, 1, 1, TimeUnit.SECONDS);
}
Answer 0 (score: 0)
You can deserialize the record from Kafka into a POJO. For versions earlier than 2.2.x, use a MessageConverter.

As of version 2.2, you can explicitly configure the deserializer to use the supplied target type and ignore the type information in the headers, by using one of its overloaded constructors that takes a boolean:
@Bean
public ConsumerFactory<String, PageViewEvent> priceEventConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(PageViewEvent.class, false));
}
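As a rough sketch of an equivalent, properties-only setup (this alternative is not from the answer itself, and the `com.example` package for `PageViewEvent` is an assumption), Spring Boot can build the same consumer factory from configuration:

```properties
spring.kafka.consumer.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=json
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
# target type for deserialization; "com.example" package is assumed
spring.kafka.consumer.properties.spring.json.value.default.type=com.example.PageViewEvent
# ignore type information in record headers, like the (PageViewEvent.class, false) constructor
spring.kafka.consumer.properties.spring.json.use.type.headers=false
spring.kafka.consumer.properties.spring.json.trusted.packages=com.example
```

This keeps the listener code unchanged while moving the deserializer wiring out of Java configuration.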
Or use a MessageConverter:
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Bytes> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Bytes> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setMessageConverter(new StringJsonMessageConverter());
    return factory;
}
Answer 1 (score: 0)
Use the JsonDeserializer from the org.springframework.kafka.support.serializer package, and also trust that package for the JSON deserializer.