How to write data from a queue into MongoDB

Asked: 2019-05-29 08:51:29

Tags: java mongodb apache-kafka queue

I have Zookeeper + Kafka running and have successfully sent tweets to a Kafka producer. The tweets come from a queue:

queue = new LinkedBlockingQueue<>(10000);

public void run() {
    client.connect();
    try (Producer<Long, String> producer = getProducer()) {
        while (true) {
            // Block until a raw JSON tweet is available, then deserialize it.
            Tweet tweet = gson.fromJson(queue.take(), Tweet.class);
            System.out.printf("Fetched tweet id %d\n", tweet.getId());
            long key = tweet.getId();
            String msg = tweet.toString();
            ProducerRecord<Long, String> record =
                    new ProducerRecord<>(KafkaConfiguration.TOPIC, key, msg);
            producer.send(record, callback);
        }
    } catch (InterruptedException e) {
        e.printStackTrace();
    } finally {
        client.stop();
    }
}

My question is: how can I write the objects I am already receiving (class Tweet) into MongoDB? I have the following configuration, pointing at localhost:

//MongoDB config
    int port_no = 27017;
    String host_name = "localhost", db_name = "bigdata", db_coll_name = "twitter";

    // Mongodb connection string.
    String client_url = "mongodb://" + host_name + ":" + port_no + "/" + db_name;
    MongoClientURI uri = new MongoClientURI(client_url);

    // Connecting to the mongodb server using the given client uri.
    MongoClient mongo_client = new MongoClient(uri);

    // Fetching the database from the mongodb.
    MongoDatabase db = mongo_client.getDatabase(db_name);

    // Fetching the collection from the mongodb.
    MongoCollection<Document> coll = db.getCollection(db_coll_name);
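Using the handles above, a consumed tweet can be written directly into the collection. Below is a minimal, self-contained sketch of that idea, assuming the tweet is available as a JSON string (as it comes off the queue) and using `Document.parse` from the MongoDB Java driver; the sample JSON fields are hypothetical:

```java
import org.bson.Document;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.client.MongoCollection;

public class TweetWriter {
    public static void main(String[] args) {
        // Same connection setup as in the question.
        MongoClientURI uri = new MongoClientURI("mongodb://localhost:27017/bigdata");
        try (MongoClient mongo_client = new MongoClient(uri)) {
            MongoCollection<Document> coll =
                    mongo_client.getDatabase("bigdata").getCollection("twitter");

            // A tweet as a JSON string, e.g. the raw value taken from the queue
            // (hypothetical fields for illustration).
            String json = "{\"id\": 1, \"text\": \"hello\"}";

            // Document.parse converts the JSON into a BSON document,
            // so no manual field-by-field mapping is needed.
            coll.insertOne(Document.parse(json));
        }
    }
}
```

If you prefer to insert the already-deserialized Tweet object instead, you would first serialize it back to JSON (e.g. with Gson) or build a `Document` field by field.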

Can this be done via JSON deserialization? Any suggestions would be appreciated. Thanks in advance.

1 Answer:

Answer 0 (score: 0):

I would suggest using the Kafka Connect MongoDB Sink Connector to push data from Kafka to MongoDB.

Example converter configuration for JSON with schemas:

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true

value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
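A minimal sink configuration might then look like this (a sketch assuming the official MongoDB Kafka sink connector; the topic name `tweets` is a placeholder for whatever `KafkaConfiguration.TOPIC` is in your setup):

```properties
name=mongodb-sink
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
tasks.max=1
# Placeholder topic name; use your actual topic.
topics=tweets
connection.uri=mongodb://localhost:27017
database=bigdata
collection=twitter
```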

If you are using Confluent Hub, you can find instructions on how to install the connector here.