Is it possible to create a codec provider from a generic case class?

Asked: 2019-07-05 04:46:49

Tags: mongodb scala mongo-scala-driver

I am trying to write a generic function that creates a CodecProvider from a given generic case class.

The BSON macro documentation does not give any examples of this.

This (unanswered) SO question is similar, but I am not interested in enumerating all possible codecs for a given type parameter. Also, my question does not involve type bounds or type variance.

Here is a minimal example of code that fails to compile:

import org.mongodb.scala.bson.codecs.Macros

case class Foo(x: Int)

case class Bar[T](x: T)

def fooCodecProvider = Macros.createCodecProvider[Foo]()
// Compiles! (No generic)

def barCodecProvider[T] = Macros.createCodecProvider[Bar[T]]()
// Compile Error:(8, 70) class Bar takes type parameters

I expect barCodecProvider to compile, but it does not.

The compile error raised by the code above says class Bar takes type parameters, which is confusing because I have already supplied a type parameter to Bar through the generic T in the signature of barCodecProvider. Have I made a syntax mistake related to the typing? Is the error a sign that I am using mongo-scala-driver incorrectly?

1 Answer:

Answer 0 (score: 1):

Other libraries handle this through implicit lookup (circe, for example). That appears to be impossible here, since none of the org.mongodb.scala.bson.codecs.Macros functions takes arguments through which a codec for the type parameter could be supplied.
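For comparison, this is roughly what the implicit-lookup pattern looks like in circe; a minimal sketch assuming circe's io.circe.Encoder API (the barEncoder name is mine, and Bar[T] is the case class from the question):

import io.circe.{Encoder, Json}

// Given an Encoder[T] in implicit scope, circe-style derivation
// lets us build an Encoder[Bar[T]] by composition:
implicit def barEncoder[T](implicit enc: Encoder[T]): Encoder[Bar[T]] =
  Encoder.instance(bar => Json.obj("x" -> enc(bar.x)))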

However, it is something you can do yourself if you know how to write a Codec.

A Codec appears to be a simple trait with three methods: encode, decode, and getEncoderClass.
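A minimal sketch of such a hand-written codec, assuming the org.bson.codecs.Codec interface and the Bar[T] case class from the question (the BarCodec name is mine, and error handling is simplified for illustration):

import org.bson.{BsonReader, BsonWriter}
import org.bson.codecs.{Codec, DecoderContext, EncoderContext}

// Delegates (de)serialization of the single field "x" to an inner Codec[T].
class BarCodec[T](inner: Codec[T]) extends Codec[Bar[T]] {

  override def encode(writer: BsonWriter, value: Bar[T], ctx: EncoderContext): Unit = {
    writer.writeStartDocument()
    writer.writeName("x")
    inner.encode(writer, value.x, ctx)
    writer.writeEndDocument()
  }

  override def decode(reader: BsonReader, ctx: DecoderContext): Bar[T] = {
    reader.readStartDocument()
    reader.readName("x") // asserts the field name before delegating
    val x = inner.decode(reader, ctx)
    reader.readEndDocument()
    Bar(x)
  }

  override def getEncoderClass: Class[Bar[T]] =
    classOf[Bar[_]].asInstanceOf[Class[Bar[T]]]
}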

This is a simple example, but it should give you an idea of what you can do: use the macros to generate the simple instances, and compose those instances with implicit functions so that the correct instance is obtained automatically.
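A sketch of that wiring, reusing Foo, Bar, and the BarCodec class from above (fooRegistry, fooCodec, and barCodec are illustrative names, not driver API):

import org.bson.codecs.Codec
import org.bson.codecs.configuration.CodecRegistries
import org.mongodb.scala.MongoClient
import org.mongodb.scala.bson.codecs.Macros

// Let the macro derive the non-generic Codec[Foo]...
val fooRegistry = CodecRegistries.fromRegistries(
  CodecRegistries.fromProviders(Macros.createCodecProvider[Foo]()),
  MongoClient.DEFAULT_CODEC_REGISTRY
)

implicit val fooCodec: Codec[Foo] = fooRegistry.get(classOf[Foo])

// ...and compose: given any Codec[T] in implicit scope, derive a Codec[Bar[T]].
implicit def barCodec[T](implicit inner: Codec[T]): Codec[Bar[T]] =
  new BarCodec[T](inner)

// The correct instance is now summoned automatically:
val barFooCodec: Codec[Bar[Foo]] = implicitly[Codec[Bar[Foo]]]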