I want to send data to Kafka and get data back from it in a single API call (see the diagram below).
Is this possible? I already know how to make the data flow in one direction (e.g., Spark Streaming reads data through the Kafka consumer API). I also know how to "fake it" with two one-way calls (e.g., the web app is both a producer and a consumer). However, when the web app makes an API call I only want it to have to deal with its own record, not every record in the topic, so that seems like the wrong approach.
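For concreteness, the "two one-way calls" variant I mean looks roughly like the sketch below, written against the newer org.apache.kafka.clients producer/consumer. The broker address, the requests/responses topic names, and the correlation-id key that lets the web app pick out its own reply are all made up for illustration, not code I actually have running:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import java.util.UUID;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class RequestReplySketch {
    public static void main(String[] args) {
        // One-way call #1: publish the request, keyed by a correlation id.
        Properties p = new Properties();
        p.put("bootstrap.servers", "localhost:9092");
        p.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        p.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        String correlationId = UUID.randomUUID().toString();
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(p)) {
            producer.send(new ProducerRecord<>("requests", correlationId, "request payload"));
        }

        // One-way call #2: read the shared response topic, but only act on
        // records whose key matches this caller's correlation id.
        Properties c = new Properties();
        c.put("bootstrap.servers", "localhost:9092");
        c.put("group.id", "web-app-" + correlationId);   // throwaway group per caller
        c.put("auto.offset.reset", "earliest");          // a new group starts from the beginning
        c.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        c.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(c)) {
            consumer.subscribe(Collections.singletonList("responses"));
            String reply = null;
            long deadline = System.currentTimeMillis() + 10_000;
            while (reply == null && System.currentTimeMillis() < deadline) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofMillis(500))) {
                    if (correlationId.equals(record.key())) {
                        reply = record.value();          // this caller's response only
                    }
                }
            }
            System.out.println("reply: " + reply);
        }
    }
}

Even with the key check, every caller still has to scan the whole response topic, which is exactly the part that feels wrong to me.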
Other sub-optimal approaches I have thought of:
Any suggestions?
Answer 0 (score: 2)
What I did is....
The downside of this approach is that the issues are not deleted immediately.
Answer 1 (score: 0)
I would suggest the third option, but with two topics: one for requests and one for responses. Here is an example:
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;
// Consumer thread: blocks on the response topic until one message arrives
// (or consumer.timeout.ms expires) and stores it in "mensaje".
public class ConsumerGroupExample extends Thread {

    private final ConsumerConnector consumer;
    private final String topic;
    private ConsumerIterator<byte[], byte[]> it;
    private String mensaje = "";

    public ConsumerGroupExample(Properties props, String a_topic) {
        consumer = kafka.consumer.Consumer.createJavaConsumerConnector(new ConsumerConfig(props));
        this.topic = a_topic;
        // Ask for a single stream on the response topic.
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put(topic, 1);
        Map<String, List<KafkaStream<byte[], byte[]>>> consumerMap = consumer.createMessageStreams(topicCountMap);
        List<KafkaStream<byte[], byte[]>> streams = consumerMap.get(topic);
        KafkaStream<byte[], byte[]> stream = streams.get(0);
        it = stream.iterator();
    }

    public void shutdown() {
        if (consumer != null) consumer.shutdown();
    }

    @Override
    public void run() {
        try {
            // hasNext() blocks until a message is available; with
            // consumer.timeout.ms set it throws if nothing arrives in time.
            if (it.hasNext()) {
                mensaje = new String(it.next().message());
            }
        } catch (kafka.consumer.ConsumerTimeoutException e) {
            // No response within consumer.timeout.ms; leave "mensaje" empty.
        }
        System.out.println(mensaje);
    }

    public String getMensaje() {
        return this.mensaje;
    }
    public static void main(String[] args) {
        // Consumer configuration (old ZooKeeper-based high-level consumer).
        Properties props = new Properties();
        props.put("zookeeper.connect", "localhost:2181");
        props.put("group.id", "Group");
        props.put("zookeeper.session.timeout.ms", "400");
        props.put("zookeeper.sync.time.ms", "200");
        props.put("auto.commit.interval.ms", "1000");
        props.put("consumer.timeout.ms", "10000");
        ConsumerGroupExample example = new ConsumerGroupExample(props, "topicForResponse");

        // Producer configuration.
        props = new Properties();
        props.put("metadata.broker.list", "localhost:9092");
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("request.required.acks", "1");
        ProducerConfig config = new ProducerConfig(props);

        // Start listening on the response topic before sending the request.
        example.start();
        try {
            Producer<String, String> colaParaEscritura = new kafka.javaapi.producer.Producer<String, String>(config);
            KeyedMessage<String, String> data = new KeyedMessage<String, String>("topicForRequest", " message ");
            colaParaEscritura.send(data);
            System.out.println("enviado");
            colaParaEscritura.close();

            // Wait for the consumer thread to pick up the response (or time out).
            example.join();
            System.out.println("final" + example.getMensaje());
        } catch (InterruptedException ie) {
            Thread.currentThread().interrupt();
        }
        example.shutdown();
    }
}
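One thing the question asks for that this example does not show is how a web request picks out only its own reply: with a shared group.id, whichever consumer happens to read the response gets it. A common fix is to key the request with a correlation id and have the responding service copy that key onto the reply. The fragment below is only a sketch of the lines in main() and run() that would change; the correlation id and the key check are my additions, not part of the original answer:

// In main(): attach a hypothetical correlation id as the message key.
String correlationId = java.util.UUID.randomUUID().toString();
KeyedMessage<String, String> data =
        new KeyedMessage<String, String>("topicForRequest", correlationId, " message ");
colaParaEscritura.send(data);

// In ConsumerGroupExample.run(): keep only the reply whose key matches, e.g.:
//     MessageAndMetadata<byte[], byte[]> m = it.next();
//     if (m.key() != null && correlationId.equals(new String(m.key()))) {
//         mensaje = new String(m.message());
//     }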