I'm running Ubuntu 18, using Java (IntelliJ IDE), and writing a basic Kafka application. I'm trying to follow the basic example from here, stream data into the application, and then print something to the screen. I run the application with IntelliJ's "run" command.
When I connect the application to an output stream it works fine, and I manage to get data printed to the terminal.
I tried adding System.out.println() inside the apply method of the foreach, but it doesn't work. I set a breakpoint there and ran in debug mode, and execution never reaches it, so I assume the stream never gets there while running.
I'm streaming data to the application on the correct topic, and the things I print elsewhere in the application, outside of apply and foreach, work fine.
How can I make the application print something every time data is streamed to it? The main idea is to process the data and print the result to the monitor instead of to a Kafka stream.
Here is the code:
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.*;
import org.apache.kafka.streams.kstream.Printed;

import java.util.Arrays;
import java.util.Locale;
import java.util.Properties;
import java.util.concurrent.CountDownLatch;

public class main {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-wordcount");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

        // setting offset reset to earliest so that we can re-run the demo code with the same pre-loaded data
        // Note: To re-run the demo, you need to use the offset reset tool:
        // https://cwiki.apache.org/confluence/display/KAFKA/Kafka+Streams+Application+Reset+Tool
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("streams-plaintext-input");

        source.foreach(new ForeachAction<String, String>() {
            @Override
            public void apply(String key, String value) {
                System.out.println("yeah");
            }
        });

        KTable<String, Long> counts = source
                .flatMapValues(new ValueMapper<String, Iterable<String>>() {
                    @Override
                    public Iterable<String> apply(String value) {
                        return Arrays.asList(value.toLowerCase(Locale.getDefault()).split(" "));
                    }
                })
                .groupBy(new KeyValueMapper<String, String, String>() {
                    @Override
                    public String apply(String key, String value) {
                        System.out.println("what");
                        return value;
                    }
                })
                .count();

        //System.exit(0);
        // need to override value serde to Long type
        System.out.println("what");
        //counts.toStream().to("streams-wordcount-output", Produced.with(Serdes.String(), Serdes.Long()));

        final KafkaStreams streams = new KafkaStreams(builder.build(), props);
        final CountDownLatch latch = new CountDownLatch(1);

        // attach shutdown handler to catch control-c
        Runtime.getRuntime().addShutdownHook(new Thread("streams-wordcount-shutdown-hook") {
            @Override
            public void run() {
                streams.close();
                latch.countDown();
            }
        });

        try {
            streams.start();
            latch.await();
        } catch (Throwable e) {
            System.exit(2);
        }
        System.exit(0);
    }
}
Answer 0 (score: 1)
Kafka Streams provides a DSL operator for printing a stream to the console:
KStream<String, String> source = builder.stream("streams-plaintext-input");
source.print(Printed.toSysOut());

KTable<String, Long> counts = source
        .flatMapValues(new ValueMapper<String, Iterable<String>>() {
        ...
./kafka-console-producer --broker-list localhost:9092 --topic streams-plaintext-input
input1
input2
IntelliJ console output:
[KSTREAM-SOURCE-0000000000]: null, input1
[KSTREAM-SOURCE-0000000000]: null, input2
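For reference, what the topology computes (flatMapValues into lower-cased words, groupBy word, count) can be sketched in plain Java without a broker. This is only an illustration of the semantics, not Kafka Streams API code; the class and method names are made up for the example:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Locale;
import java.util.Map;

public class WordCountSketch {
    // Plain-Java equivalent of flatMapValues(split on spaces) + groupBy(word) + count()
    static Map<String, Long> count(Iterable<String> lines) {
        Map<String, Long> counts = new LinkedHashMap<>();
        for (String line : lines) {
            for (String word : line.toLowerCase(Locale.getDefault()).split(" ")) {
                counts.merge(word, 1L, Long::sum); // increment the running count for this word
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count(Arrays.asList("hello kafka", "hello streams"));
        // Print each word's count to the console, which is the effect the question is after
        counts.forEach((word, n) -> System.out.println(word + ": " + n));
    }
}
```

In the real application, the same console output can be obtained from the aggregated table with `counts.toStream().print(Printed.toSysOut())`, alongside the `source.print(...)` shown above.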