I am trying to bind two channels using a Spring Cloud Stream binding interface:
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.Output;

public interface Binding {

    String INPUT = "request";
    String OUTPUT = "response";

    @Input(INPUT)
    KStream<?, ?> request();

    @Output(OUTPUT)
    KStream<?, ?> response();
}
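For context, this is roughly how the interface is wired up in the application (a minimal sketch; the StreamApplication class name, the String/String generics, and the pass-through mapValues are placeholders, not my real processing code):

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.messaging.handler.annotation.SendTo;

@SpringBootApplication
@EnableBinding(Binding.class)
public class StreamApplication {

    public static void main(String[] args) {
        SpringApplication.run(StreamApplication.class, args);
    }

    // Consumes the "request" KStream and sends the result to the "response" binding.
    @StreamListener(Binding.INPUT)
    @SendTo(Binding.OUTPUT)
    public KStream<String, String> process(KStream<String, String> requests) {
        // Placeholder transformation; the real processing logic is omitted here.
        return requests.mapValues(value -> value);
    }
}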
The application.properties is:
spring.kafka.client-id=h-repo
# ===============================
# = STREAM APPLICATION PROPERTIES
# ===============================
spring.cloud.stream.kafka.streams.binder.application-id=h-repository
spring.cloud.stream.kafka.streams.binder.configuration.commit.interval.ms=1000
spring.cloud.stream.kafka.streams.binder.configuration.spring.json.trusted.packages=org.packages.model
# ===============================
# = REQUEST STREAM(S) PROCESSOR
# ===============================
spring.cloud.stream.bindings.request.destination=request
spring.cloud.stream.bindings.request.group=h-request
spring.cloud.stream.bindings.request.contentType=application/java-serialized-object
spring.cloud.stream.bindings.request.consumer.use-native-decoding=true
spring.cloud.stream.bindings.request.consumer.partitioned=true
spring.cloud.stream.bindings.request.consumer.instance-count=1
spring.cloud.stream.bindings.request.consumer.instance-index=0
spring.cloud.stream.kafka.bindings.request.consumer.auto-rebalance-enabled=false
spring.cloud.stream.kafka.bindings.request.consumer.auto-commit-offset=true
spring.cloud.stream.kafka.bindings.request.consumer.ack-each-record=true
spring.cloud.stream.kafka.bindings.request.consumer.configuration.max.poll.records=30
spring.cloud.stream.kafka.streams.bindings.request.consumer.keySerde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.kafka.streams.bindings.request.consumer.valueSerde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.bindings.response.destination=response
spring.cloud.stream.bindings.response.group=h-response
spring.cloud.stream.bindings.response.contentType=application/java-serialized-object
spring.cloud.stream.bindings.response.producer.use-native-encoding=true
spring.cloud.stream.bindings.response.producer.partition-count=20
spring.cloud.stream.bindings.response.producer.partition-key-extractor-name=kafkaPartitionKeyExtractor
spring.cloud.stream.kafka.streams.bindings.response.producer.keySerde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.kafka.streams.bindings.response.producer.valueSerde=org.springframework.kafka.support.serializer.JsonSerde
First, I tried the configuration above, but it did not work because I had misunderstood partitioned streams. Now I am trying to configure the stream producer/consumer with the following approach:
spring.cloud.stream.kafka.bindings.<channel>.consumer/producer.configuration.
At first I tried setting just one property, picked from the example here: https://github.com/spring-cloud/spring-cloud-stream-binder-kafka/issues/240
It did not work either. The end goal is to be able to configure consumer/producer partitioning.
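For completeness, the kafkaPartitionKeyExtractor referenced in the response binding is registered roughly like this (a minimal sketch assuming the standard PartitionKeyExtractorStrategy contract; the hash-based key is only a placeholder, not my real key logic):

import org.springframework.cloud.stream.binder.PartitionKeyExtractorStrategy;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class PartitionConfig {

    // Bean name must match spring.cloud.stream.bindings.response.producer.partition-key-extractor-name.
    @Bean
    public PartitionKeyExtractorStrategy kafkaPartitionKeyExtractor() {
        // Derive the partition key from the message payload (placeholder logic).
        return message -> message.getPayload().hashCode();
    }
}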