I am currently sending compressed messages through a Kafka Producer, enabling compression with the following property.
When the KafkaProducer object is instantiated, the ProducerConfig logs the following property list:
{compression.type=gzip, metric.reporters=[], metadata.max.age.ms=300000, metadata.fetch.timeout.ms=60000, reconnect.backoff.ms=50, sasl.kerberos.ticket.renew.window.factor=0.8, bootstrap.servers=[MyServer_IP:9092], retry.backoff.ms=100, sasl.kerberos.kinit.cmd=/usr/bin/kinit, buffer.memory=33554432, timeout.ms=30000, key.serializer=class org.apache.kafka.common.serialization.StringSerializer, sasl.kerberos.service.name=null, sasl.kerberos.ticket.renew.jitter=0.05, ssl.keystore.type=JKS, ssl.trustmanager.algorithm=PKIX, block.on.buffer.full=false, ssl.key.password=null, max.block.ms=60000, sasl.kerberos.min.time.before.relogin=60000, connections.max.idle.ms=540000, ssl.truststore.password=null, max.in.flight.requests.per.connection=5, metrics.num.samples=2, client.id=, ssl.endpoint.identification.algorithm=null, ssl.protocol=TLS, request.timeout.ms=30000, ssl.provider=null, ssl.enabled.protocols=[TLSv1.2, TLSv1.1, TLSv1], acks=1, batch.size=16384, ssl.keystore.location=null, receive.buffer.bytes=32768, ssl.cipher.suites=null, ssl.truststore.type=JKS, security.protocol=PLAINTEXT, retries=0, max.request.size=1048576, value.serializer=class org.apache.kafka.common.serialization.ByteArraySerializer, ssl.truststore.location=null, ssl.keystore.password=null, ssl.keymanager.algorithm=SunX509, metrics.sample.window.ms=30000, partitioner.class=class org.apache.kafka.clients.producer.internals.DefaultPartitioner, send.buffer.bytes=131072, linger.ms=0}
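For reference, a minimal sketch of how a producer with this configuration is typically constructed (the broker address "MyServer_IP:9092" is taken from the list above; everything else uses standard Kafka client API calls):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;

public class GzipProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address as it appears in the property dump above.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "MyServer_IP:9092");
        // Enables gzip compression for ALL topics this producer writes to.
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "gzip");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.ByteArraySerializer");

        // try-with-resources closes the producer and flushes pending records.
        try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
            // producer.send(...) calls would go here.
        }
    }
}
```

Note that compression.type is a producer-level setting: it applies uniformly to every topic the producer sends to, which is exactly the behavior described in the problem below.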
Problem:
In this list, "compression.type=gzip" is clearly set as required, but "compressed.topics" is missing. As a result, compression is applied to all topics, whereas I need it to be selective (per topic).
Findings:
I debugged the code and found that the "compressed.topics" property is not defined at all in the ProducerConfig.java class, so it is not among the recognized properties when the KafkaProducer object is instantiated.
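Since the new Java producer does not recognize compressed.topics, one possible workaround is to keep two producers, one with compression enabled and one without, and route each record to the appropriate producer based on its topic. A hedged sketch of this idea (the topic names "logs" and "metrics" are hypothetical placeholders, not from the original post):

```java
import java.util.Properties;
import java.util.Set;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SelectiveCompressionProducer {
    // Hypothetical set of topics that should be compressed.
    private static final Set<String> COMPRESSED_TOPICS = Set.of("logs", "metrics");

    private final KafkaProducer<String, byte[]> gzipProducer;
    private final KafkaProducer<String, byte[]> plainProducer;

    public SelectiveCompressionProducer(String bootstrapServers) {
        gzipProducer = newProducer(bootstrapServers, "gzip");
        plainProducer = newProducer(bootstrapServers, "none");
    }

    private static KafkaProducer<String, byte[]> newProducer(
            String bootstrapServers, String compressionType) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, compressionType);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        return new KafkaProducer<>(props);
    }

    // Routes the record to the compressed or uncompressed producer by topic.
    public void send(String topic, String key, byte[] value) {
        KafkaProducer<String, byte[]> producer =
                COMPRESSED_TOPICS.contains(topic) ? gzipProducer : plainProducer;
        producer.send(new ProducerRecord<>(topic, key, value));
    }

    public void close() {
        gzipProducer.close();
        plainProducer.close();
    }
}
```

The cost of this approach is a second producer instance (extra buffer memory and connections), but it reproduces the per-topic selectivity that compressed.topics provided in the older producer API.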