I am running Kafka in a Docker container using a Docker image, and I am producing messages to it from a small Java application. However, when I try to consume the messages from a terminal, I don't see any of them being consumed. I am consuming from the Kafka Docker container with the command

kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test-topic --from-beginning

I have also tried localhost:9094. However, I can see the messages in the topic log inside the Kafka container at /kafka/kafka-logs-kafka/test-topic-0, so I know they are being added to Kafka. Can anyone tell me why the messages are in the log but are not being consumed, and what the fix is? I am able to produce and consume messages from within the Docker container itself.
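For reference, the Java equivalent of what I am trying to do from the console would look roughly like the sketch below (just a sketch: the class name ConsumerCheck and the group id "consumer-check" are placeholders I made up, and with newer kafka-clients versions poll takes a Duration instead of a long):

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ConsumerCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Same address the producer uses; "consumer-check" is an arbitrary group id.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9094");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "consumer-check");
        // Matches --from-beginning for a group with no committed offsets.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, LongDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<Long, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("test-topic"));
            while (true) {
                // With newer kafka-clients, use consumer.poll(Duration.ofMillis(100)) instead.
                ConsumerRecords<Long, String> records = consumer.poll(100);
                for (ConsumerRecord<Long, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}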
Here is my docker-compose.yml for Kafka:
kafka:
  hostname: kafka
  image: wurstmeister/kafka:0.10.2.1
  environment:
    KAFKA_LISTENERS: INTERNAL://kafka:9092,OUTSIDE://kafka:9094
    KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka:9092,OUTSIDE://localhost:9094
    KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,OUTSIDE:PLAINTEXT
    KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL
    KAFKA_ZOOKEEPER_CONNECT: zk:2181
    KAFKA_CREATE_TOPICS: test-topic:1:1
  ports:
    - "22181:2181"
    - "9092:9092"
    - "9094:9094"
The Java application I am producing messages from:
public static void main( String[] args )
{
    final String TOPIC = "test-topic";
    final Producer<Long, String> producer = MessageProducer.createProducer();
    long time = System.currentTimeMillis();

    try {
        for (long index = 0; index < 10; index++) {
            final ProducerRecord<Long, String> record =
                    new ProducerRecord<>(TOPIC, index, "TEST!!!!! " + index);

            RecordMetadata metadata = producer.send(record).get();

            long elapsedTime = System.currentTimeMillis() - time;
            System.out.printf("sent record(key=%s value=%s) " + "meta(partition=%d, offset=%d) time=%d\n",
                    record.key(), record.value(), metadata.partition(), metadata.offset(), elapsedTime);
        }
    } catch (InterruptedException e) {
        e.printStackTrace();
    } catch (ExecutionException e) {
        e.printStackTrace();
    } finally {
        producer.flush();
        producer.close();
    }
}

class MessageProducer {
    private final static String BOOTSTRAP_SERVERS = "localhost:9094";

    static Producer<Long, String> createProducer() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP_SERVERS);
        props.put(ProducerConfig.CLIENT_ID_CONFIG, "KafkaProducer");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, LongSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        return new KafkaProducer<>(props);
    }
}
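In case it helps with debugging, here is a small check I could run with the same producer configuration to print which broker address the client is told to fetch each partition from (only a sketch; the class name ListenerCheck is made up, and it assumes it sits in the same package as MessageProducer above):

import java.util.List;

import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.common.PartitionInfo;

public class ListenerCheck {
    public static void main(String[] args) {
        // Reuses MessageProducer.createProducer() from the snippet above (localhost:9094).
        try (Producer<Long, String> producer = MessageProducer.createProducer()) {
            List<PartitionInfo> partitions = producer.partitionsFor("test-topic");
            for (PartitionInfo info : partitions) {
                // leader() is the advertised broker address the client will connect to
                // when producing to or fetching from this partition.
                System.out.printf("partition=%d leader=%s:%d%n",
                        info.partition(), info.leader().host(), info.leader().port());
            }
        }
    }
}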