java.io.NotSerializableException with Apache Flink and Lagom

Asked: 2017-04-23 12:01:43

Tags: scala flink-streaming lagom flink-cep

I am writing a Flink CEP program inside a Lagom microservice implementation. The Flink CEP program runs perfectly well in a plain Scala application, but when I use the same code inside the Lagom service implementation, I get the following exception:

[screenshot: java.io.NotSerializableException stack trace]

Lagom service implementation:

override def start = ServiceCall[NotUsed, String] { req =>

  val env = StreamExecutionEnvironment.getExecutionEnvironment
  env.setParallelism(1)
  env.getConfig.disableSysoutLogging()

  val topic_name = "topic_test"

  val props = new Properties
  props.put("bootstrap.servers", "localhost:9092")
  props.put("acks", "all")
  props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer")
  props.put("block.on.buffer.full", "false")

  val kafkaSource = new FlinkKafkaConsumer010(topic_name, new KafkaDeserializeSchema, props)

  val stream = env.addSource(kafkaSource)

  val deliveryPattern = Pattern.begin[XYZ]("begin").where(_.ABC == 5)
    .next("next").where(_.ABC == 10)
    .next("end").where(_.ABC == 5)

  val deliveryPatternStream = CEP.pattern(stream, deliveryPattern)

  def selectFn(pattern: collection.mutable.Map[String, XYZ]): String = {
    val startEvent = pattern("begin")
    val nextEvent = pattern("next")
    "Alert Detected"
  }

  deliveryPatternStream.select(selectFn(_)).print()

  env.execute("CEP")
  Future.successful("Done")
}

I don't understand how to resolve this problem.
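A likely cause (not confirmed by the post, since the stack trace is only a screenshot): Flink serializes every user function it ships to the runtime, and a function such as `selectFn` defined inside the service class captures the enclosing Lagom service instance, which is not `Serializable`. The sketch below reproduces that mechanism with plain JDK serialization; `ServiceLike` and `PatternSelectors` are illustrative names, not from the original code.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Stand-in for a non-serializable enclosing class, like a Lagom service impl.
class ServiceLike {
  def tag(s: String): String = s"alert: $s"

  // Calls an instance method, so the lambda captures `this` -- the whole
  // non-serializable service would be dragged into Flink's serialized closure.
  def capturingFn: String => String = s => tag(s)
}

// Standalone serializable holder: functions defined here capture nothing,
// so they can be shipped to the Flink runtime safely.
object PatternSelectors extends Serializable {
  val selectFn: String => String = s => s"alert: $s"
}

// Check what Flink's closure serialization would do with a given object.
def isSerializable(obj: AnyRef): Boolean =
  try {
    new ObjectOutputStream(new ByteArrayOutputStream).writeObject(obj)
    true
  } catch {
    case _: NotSerializableException => false
  }
```

Under this hypothesis, moving `selectFn` (and any other user functions) out of the service class into a top-level serializable `object`, so they no longer close over the service instance, would make the NotSerializableException go away.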

0 Answers:

No answers