Trouble deserializing Avro data in Scala

Asked: 2018-07-01 17:38:17

Tags: scala apache-kafka apache-flink avro

I am building an Apache Flink application in Scala that reads streaming data off a Kafka bus and then performs summarizing operations on it. The data from Kafka is in Avro format and needs a special deserialization class. I found this Scala class, AvroDeserializationSchema (http://codegists.com/snippet/scala/avrodeserializationschemascala_saveveltri_scala):

package org.myorg.quickstart
import org.apache.avro.io.BinaryDecoder
import org.apache.avro.io.DatumReader
import org.apache.avro.io.DecoderFactory
import org.apache.avro.reflect.ReflectDatumReader
import org.apache.avro.specific.{SpecificDatumReader, SpecificRecordBase}
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.java.typeutils.TypeExtractor
import org.apache.flink.api.common.serialization._
import java.io.IOException

class AvroDeserializationSchema[T](val avroType: Class[T]) extends DeserializationSchema[T] {
  private var reader: DatumReader[T] = null
  private var decoder: BinaryDecoder = null

  override def deserialize(message: Array[Byte]): T = {
    ensureInitialized()
    try {
      // Reuse the decoder instance across calls to avoid reallocating buffers
      decoder = DecoderFactory.get.binaryDecoder(message, decoder)
      reader.read(null.asInstanceOf[T], decoder)
    } catch {
      case e: IOException => throw new RuntimeException(e)
    }
  }

  override def isEndOfStream(nextElement: T): Boolean = false

  override def getProducedType: TypeInformation[T] = TypeExtractor.getForClass(avroType)

  private def ensureInitialized(): Unit = {
    if (reader == null) {
      // Avro-generated classes extend SpecificRecordBase; anything else
      // (e.g. a plain case class) falls back to the reflection-based reader
      reader =
        if (classOf[SpecificRecordBase].isAssignableFrom(avroType))
          new SpecificDatumReader[T](avroType)
        else
          new ReflectDatumReader[T](avroType)
    }
  }
}

In my streaming class, I use it as follows:

val stream = env
        .addSource(new FlinkKafkaConsumer010[String]("test",
            new AvroDeserializationSchema[DeviceData](Class[DeviceData]), properties))

where DeviceData is a Scala case class defined in the same project:

/** Case class to hold the Device data. */
case class DeviceData(deviceId: String,
                    sw_version: String,
                    timestamp: String,
                    reading: Double
                   )

When compiling the StreamingKafkaClient.scala class I get the following error:

Error:(24, 102) object java.lang.Class is not a value
        .addSource(new FlinkKafkaConsumer010[String]("test", new AvroDeserializationSchema[DeviceData](Class[DeviceData]), properties))

I also tried

val stream = env
        .addSource(new FlinkKafkaConsumer010[String]("test",
            new AvroDeserializationSchema[DeviceData](classOf[DeviceData]), properties))

with which I got a different error:

Error:(21, 20) overloaded method constructor FlinkKafkaConsumer010 with alternatives:
  (x$1: java.util.regex.Pattern,x$2: org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema[String],x$3: java.util.Properties)org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010[String] <and>
  (x$1: java.util.regex.Pattern,x$2: org.apache.flink.api.common.serialization.DeserializationSchema[String],x$3: java.util.Properties)org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010[String] <and>
  (x$1: java.util.List[String],x$2: org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema[String],x$3: java.util.Properties)org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010[String] <and>
  (x$1: java.util.List[String],x$2: org.apache.flink.api.common.serialization.DeserializationSchema[String],x$3: java.util.Properties)org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010[String] <and>
  (x$1: String,x$2: org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema[String],x$3: java.util.Properties)org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010[String] <and>
  (x$1: String,x$2: org.apache.flink.api.common.serialization.DeserializationSchema[String],x$3: java.util.Properties)org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010[String]
 cannot be applied to (String, org.myorg.quickstart.AvroDeserializationSchema[org.myorg.quickstart.DeviceData], java.util.Properties)
        .addSource(new FlinkKafkaConsumer010[String]("test", new AvroDeserializationSchema[DeviceData](classOf[DeviceData]), properties))

I am new to Scala (this is my first Scala program), so I know I am missing something fundamental here. As I try to learn Scala, can someone point out what I am doing wrong? My intent is basically to read Avro-encoded data from Kafka into Flink and do some operations on the streaming data. I could not find any examples of the AvroDeserializationSchema class in use, which it seems to me should be built into the Flink packages.

1 Answer:

Answer 0 (score: 0)

To get the class object in Scala, you need classOf[DeviceData], not Class[DeviceData]:

new AvroDeserializationSchema[DeviceData](classOf[DeviceData])
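To make the distinction concrete: `classOf[T]` is Scala's equivalent of Java's `T.class` literal and yields a `java.lang.Class[T]` *value*, whereas `Class[DeviceData]` merely names a type, which is exactly what the "object java.lang.Class is not a value" error is complaining about. A minimal, Flink-free illustration (the two-field case class here is a stand-in for the question's DeviceData):

```scala
// Stand-in for the question's case class, just to demonstrate the syntax
case class DeviceData(deviceId: String, reading: Double)

// classOf[T] produces the Class[T] value at compile time
val cls: Class[DeviceData] = classOf[DeviceData]
println(cls.getSimpleName) // prints "DeviceData"

// By contrast, this does not compile, because Class[DeviceData] is a type,
// not a value ("object java.lang.Class is not a value"):
// val bad = Class[DeviceData]
```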
  

As for "I could not find any examples of AvroDeserializationSchema class usage": I found one (in Java).

Also, in the Flink 1.6 release this class will be added to Flink itself, rather than you having to copy it from elsewhere. See FLINK-9337 & FLINK-9338.
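On Flink 1.6+, usage would look roughly like the sketch below. This assumes the factory methods of the flink-avro module's own AvroDeserializationSchema (the class those tickets introduce); it will not compile against earlier Flink versions, and the record schema shown is made up for illustration:

```scala
import org.apache.avro.Schema
import org.apache.flink.formats.avro.AvroDeserializationSchema

// For Avro-generated classes (extending SpecificRecordBase):
// val specificSchema = AvroDeserializationSchema.forSpecific(classOf[MyGeneratedRecord])

// For GenericRecord, given the writer schema as JSON (illustrative schema):
val avroSchema = new Schema.Parser().parse(
  """{"type":"record","name":"DeviceData","fields":[
    |  {"name":"deviceId","type":"string"},
    |  {"name":"reading","type":"double"}
    |]}""".stripMargin)
val deserializer = AvroDeserializationSchema.forGeneric(avroSchema)
```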

As mentioned in the comments, if you want to use the Confluent Avro Schema Registry rather than supplying a class type, see this answer, or refer to the code in the GitHub link above.
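For the registry route, a rough sketch follows, assuming the flink-avro-confluent-registry module's ConfluentRegistryAvroDeserializationSchema is on the classpath (Flink 1.6+); the registry URL is a placeholder:

```scala
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema

// Resolves each message's writer schema from the registry by embedded schema ID.
// forGeneric takes the reader schema plus the registry URL (placeholder here);
// forSpecific(classOf[...], url) exists for Avro-generated SpecificRecord classes.
val registryDeserializer =
  ConfluentRegistryAvroDeserializationSchema.forGeneric(avroSchema, "http://registry:8081")
```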

Also, if you are running Kafka 0.11+ (or Confluent 3.3+), then ideally you should be using FlinkKafkaConsumer011 along with the class you are deserializing to:

new FlinkKafkaConsumer011[DeviceData]
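Putting the pieces together: the consumer's type parameter must match the schema's element type. The second compile error arises because FlinkKafkaConsumer010[String] expects a DeserializationSchema[String], but an AvroDeserializationSchema[DeviceData] was supplied. A sketch of the corrected wiring, keeping the question's names (and the question's hand-rolled schema class):

```scala
val stream = env
  .addSource(new FlinkKafkaConsumer011[DeviceData](
    "test",
    new AvroDeserializationSchema[DeviceData](classOf[DeviceData]),
    properties))
```

The same fix applies if you stay on FlinkKafkaConsumer010: parameterize it with [DeviceData] rather than [String].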