I have a producer using Java 11 / Spring and apache-avro, and on the other side a consumer using Scala 2.12 and Akka. If I'm not mistaken, Avro payloads should be language-agnostic. However, when I try to read the value of the ConsumerRecord in the Scala consumer, I get a ClassCastException:
java.lang.ClassCastException: class com.example.schema.DetailsSchema cannot be cast to class com.example.schema.scala.DetailsSchema (com.example.schema.DetailsSchema and com.example.schema.scala.DetailsSchema are in unnamed module of loader 'app')
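For context on what the exception is saying: the two classes share a simple name but live in different packages, so the JVM treats them as completely unrelated types, and a cast between them can never succeed. A minimal, self-contained sketch (the package objects here are stand-ins, not the real generated classes):

```scala
// stand-ins for the Java-generated and Scala-generated classes
object javaGen  { class DetailsSchema }
object scalaGen { class DetailsSchema }

val record: Any = new javaGen.DetailsSchema

// casting to the same-named class in another package fails at runtime
val castSucceeded =
  try { record.asInstanceOf[scalaGen.DetailsSchema]; true }
  catch { case _: ClassCastException => false }

println(castSucceeded) // false
```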
I am using a couple of plugins to generate both Java and Scala classes from my schemas.
For the Scala classes I simply append a `.scala` package to the end of the namespace to avoid collisions with the Java-generated classes.
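The question doesn't name the Scala-side plugin, but assuming sbt-avrohugger, the namespace suffix could be applied with its custom-namespace setting. This is a sketch from memory; the setting key may differ by plugin version and record format:

```scala
// build.sbt (sketch): remap the Avro namespace for the Scala-generated
// classes so they don't collide with the Java-generated ones on the
// same classpath. Assumes sbt-avrohugger's SpecificRecord format.
avroScalaSpecificCustomNamespace := Map(
  "com.example.schema" -> "com.example.schema.scala"
)
```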
import akka.kafka.{ConsumerSettings, Subscriptions}
import akka.kafka.scaladsl.Consumer
import akka.kafka.scaladsl.Consumer.DrainingControl
import akka.stream.scaladsl.Sink
import com.example.schema.scala.DetailsSchema
import io.confluent.kafka.serializers.{AbstractKafkaSchemaSerDeConfig, KafkaAvroDeserializer, KafkaAvroDeserializerConfig}
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.common.serialization._
import scala.collection.JavaConverters._

val kafkaAvroSerDeConfig = Map[String, Any](
  AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG -> "http://localhost:8081",
  KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG -> true.toString
)

val kafkaConsumerSettings: ConsumerSettings[String, DetailsSchema] = {
  val kafkaAvroDeserializer = new KafkaAvroDeserializer()
  kafkaAvroDeserializer.configure(kafkaAvroSerDeConfig.asJava, false)
  // unchecked cast: the compiler cannot verify this, it is deferred to runtime
  val deserializer = kafkaAvroDeserializer.asInstanceOf[Deserializer[DetailsSchema]]
  ConsumerSettings(system, new StringDeserializer, deserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("xt-collector")
    .withProperties(
      (ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true"),
      (ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "5000"),
      (ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest"))
}
val control = Consumer
  .plainSource(kafkaConsumerSettings, Subscriptions.topics("details"))
  .map { msg =>
    try {
      val details = msg.value() // HERE I get the ClassCastException:
      // java.lang.ClassCastException: class com.example.schema.DetailsSchema cannot be cast to class com.example.schema.scala.DetailsSchema
      log.info(s"Received ${details._id} to sink into mongo/elastic")
      details
    } catch {
      case e: Exception =>
        log.error("Error while deserializing details", e)
        throw e
    }
  }
  // .via(detailsCollection.flow) Just sink into mongo/elastic
  .recover {
    case t: Throwable => log.error("Error while storing in mongo/elastic", t)
  }
  .toMat(Sink.seq)(DrainingControl.apply)
  .run()
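Note that the `asInstanceOf[Deserializer[...]]` cast in the settings block is unchecked: because of type erasure it cannot fail at the cast site, so the type mismatch only surfaces later, when the deserialized value is first used with its assumed type, which is exactly the `msg.value()` line above. A minimal sketch of that deferral (plain Scala, no Kafka involved):

```scala
// an Any that actually holds a String
val payload: Any = "not an Int"

// unchecked cast: erasure means nothing is verified here, no exception yet
val pretendInts = List(payload).asInstanceOf[List[Int]]

// the ClassCastException only fires when an element is used as an Int
val failedOnUse =
  try { pretendInts.head + 1; false }
  catch { case _: ClassCastException => true }

println(failedOnUse) // true
```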
My question is: is it possible to produce Avro records in Java and deserialize them on the other side into Scala case classes, if both are generated from the same schema and both extend SpecificRecordBase?