In Spring Boot with Kafka, I can set the ConsumerFactory properties as follows:
@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, EnrichedOrder> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "barnwaldo");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "1000");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, EnrichedOrderDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, EnrichedOrder> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, EnrichedOrder> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
With Kafka Streams, I can set the properties in code like this:
final Properties streamsConfiguration = new Properties();
streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-lambda-example");
streamsConfiguration.put(StreamsConfig.CLIENT_ID_CONFIG, "wordcount-lambda-example-client");
streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
streamsConfiguration.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
streamsConfiguration.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
streamsConfiguration.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 10 * 1000);
streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);

// Build the topology (stream definitions omitted here for brevity)
final StreamsBuilder builder = new StreamsBuilder();
final KafkaStreams streams = new KafkaStreams(builder.build(), streamsConfiguration);
When using Spring Cloud Stream with Kafka Streams, it appears that all properties can only be supplied through an application.properties or application.yml file in the resources folder, for example:
spring.cloud.stream.bindings:
  output:
    contentType: application/json
    destination: data2
  input:
    contentType: application/json
    destination: data1
spring.cloud.stream.kafka.streams:
  binder:
    brokers: localhost
    configuration:
      commit.interval.ms: 1000
      default.key.serde: org.apache.kafka.common.serialization.Serdes$StringSerde
      default.value.serde: org.apache.kafka.common.serialization.Serdes$StringSerde
  bindings.input.consumer:
    applicationId: data-tester
When using Spring Cloud Stream with Kafka Streams, is it possible to supply the properties in a HashMap or Properties object?
Perhaps this could be done with KafkaMessageChannelBinder, or by extending AbstractMessageChannelBinder - see https://github.com/spring-cloud/spring-cloud-stream-binder-kafka/blob/7355ada4613ad50fe95430f1859d4ea65f004be1/spring-cloud-stream-binder-kafka/src/main/java/org/springframework/cloud/stream/binder/kafka/KafkaMessageChannelBinder.java.
I cannot find any documentation on this; any help would be greatly appreciated.
Answer 0 (score: 0)
This is supported at the binder level by default; the properties should be prefixed with spring.cloud.stream.kafka.streams.binder.
If you look at the KafkaStreamsBinderSupportAutoConfiguration class, you can see the beans that read these YAML properties and apply them to the Kafka Streams configuration.
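To make the prefixing concrete, here is an illustrative sketch (not the binder's actual code) of how dotted `spring.cloud.stream.kafka.streams.binder.configuration.*` entries end up as plain Kafka Streams property keys - the binder prefix is stripped and the remainder is passed through to the streams configuration:

```java
import java.util.Map;
import java.util.Properties;

public class BinderPropertyMapper {

    // The binder-level prefix under which arbitrary Kafka Streams
    // properties are nested in application.yml.
    static final String PREFIX =
            "spring.cloud.stream.kafka.streams.binder.configuration.";

    // Copy every entry under PREFIX into a Properties object,
    // stripping the prefix so the raw Kafka key remains.
    public static Properties extract(Map<String, String> springProps) {
        Properties streamsProps = new Properties();
        for (Map.Entry<String, String> e : springProps.entrySet()) {
            if (e.getKey().startsWith(PREFIX)) {
                streamsProps.setProperty(
                        e.getKey().substring(PREFIX.length()), e.getValue());
            }
        }
        return streamsProps;
    }

    public static void main(String[] args) {
        Properties p = extract(Map.of(
                PREFIX + "commit.interval.ms", "1000",
                PREFIX + "default.key.serde",
                "org.apache.kafka.common.serialization.Serdes$StringSerde"));
        System.out.println(p.getProperty("commit.interval.ms")); // prints 1000
    }
}
```

So `spring.cloud.stream.kafka.streams.binder.configuration.commit.interval.ms: 1000` in the YAML reaches Kafka Streams as plain `commit.interval.ms=1000`.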
Answer 1 (score: 0)
Thank you for your answer - I refer to that first reference frequently, and I will review the KafkaStreamsBinderSupportAutoConfiguration class via the GitHub link.
Perhaps I can ask a more specific question about the properties...
My understanding is that Kafka Streams can perform secure stream processing by configuring it to encrypt data in transit (when communicating with the target Kafka cluster) and to enable client authentication. In the Spring Cloud Stream Kafka Streams implementation, how can this be accomplished in the application.yml properties file - specifically the SECURITY and SSL properties?
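As a sketch of what this could look like (the file paths and passwords below are placeholders, and exact support may vary by binder version), the standard Kafka client security settings can be nested under the binder's configuration block, which forwards arbitrary Kafka properties to the streams application:

```yaml
spring.cloud.stream.kafka.streams.binder:
  brokers: localhost:9093
  configuration:
    security.protocol: SSL
    ssl.truststore.location: /path/to/kafka.client.truststore.jks
    ssl.truststore.password: changeit
    ssl.keystore.location: /path/to/kafka.client.keystore.jks
    ssl.keystore.password: changeit
    ssl.key.password: changeit
```

These are the same `security.protocol` and `ssl.*` keys a plain Kafka Streams application would put into its Properties object, just expressed under the binder prefix.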
Thanks again for your help.