Spark job fails when filtering Kafka messages

Time: 2019-06-10 13:38:59

Tags: scala apache-spark apache-kafka

I need to verify that event messages sent to Kafka are valid by checking that they contain the required fields, and if they do, push the data to Elasticsearch. This is my approach:

import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.dstream.InputDStream
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferBrokers
import org.elasticsearch.spark.streaming.EsSparkStreaming
import org.json4s.jackson.JsonMethods.parse // json4s; the native backend works the same way

object App {

  // Parse each JSON string, normalize single events / event lists into maps,
  // then validate that the required fields are present.
  val parseJsonStream = (inStream: RDD[String]) => {
    inStream.flatMap(json => {
      try {
        val parsed = parse(json)
        Option(parsed)
      } catch {
        case e: Exception =>
          System.err.println("Exception while parsing JSON: " + json)
          e.printStackTrace()
          None
      }
    }).flatMap(v => {
      // A message carries either a list of events or a single event.
      if (v.values.isInstanceOf[List[Map[String, Map[String, Any]]]])
        v.values.asInstanceOf[List[Map[String, Map[String, Any]]]]
      else if (v.values.isInstanceOf[Map[String, Map[String, Any]]])
        List(v.values.asInstanceOf[Map[String, Map[String, Any]]])
      else {
        System.err.println("EVENT WRONG FORMAT: " + v.values)
        List()
      }
    }).flatMap(mapa => {
      val h = mapa.get("header")
      val b = mapa.get("body")
      if (h.toSeq.toString.contains("session.end") && !b.toSeq.toString.contains("duration")) {
        System.err.println("session.end HAS NO DURATION FIELD!")
        None
      }
      else if (h.isEmpty || h.get.get("userID").isEmpty || h.get.get("timestamp").isEmpty) {
        throw new Exception("FIELD IS MISSING")
        None
      }
      else {
        Some(mapa)
      }
    })
  }

  // ssc, kafkaParams, KAFKA_EVENT_TOPICS, addElasticMetadata and the
  // ELASTICSEARCH_* constants are defined elsewhere in the application.
  val kafkaStream: InputDStream[ConsumerRecord[String, String]] = KafkaUtils.createDirectStream[String, String](
    ssc, PreferBrokers, Subscribe[String, String](KAFKA_EVENT_TOPICS, kafkaParams)
  )
  val kafkaStreamParsed = kafkaStream.transform(rdd => {
    val eventJSON = rdd.map(_.value)
    parseJsonStream(eventJSON)
  })

  val esEventsStream = kafkaStreamParsed.map(addElasticMetadata(_))

  try {
    EsSparkStreaming.saveToEs(esEventsStream,
      ELASTICSEARCH_EVENTS_INDEX + "_{postfix}" + "/" + ELASTICSEARCH_TYPE,
      Map("es.mapping.id" -> "docid"))
  } catch {
    case e: Exception =>
      EsSparkStreaming.saveToEs(esEventsStream, ELASTICSEARCH_FAILED_EVENTS)
      e.printStackTrace()
  }
}

I suppose someone is sending invalid events (which is exactly why I perform this check in the first place), but instead of skipping the bad message, the Spark job fails with:

User class threw exception: org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 6.0 failed 4 times, most recent failure: Lost task 2.3 in stage 6.0 (TID 190, xxx.xxx.host.xx, executor 3): java.lang.Exception: FIELD IS MISSING

How can I prevent it from crashing and just skip the bad message instead? It's a YARN application, using:

Spark 2.3.1
Spark-streaming-kafka-0-10_2.11:2.3.1
Scala 2.11.8

1 answer:

Answer 0 (score: 3)

Instead of this

throw new Exception("FIELD IS MISSING")
None

just do

None

Throwing that exception is what kills your program: it propagates out of the flatMap, fails the task, and after four failed attempts (the default value of spark.task.maxFailures, matching the "failed 4 times" in your error) Spark aborts the whole job. The None after the throw is unreachable anyway; returning None instead simply drops the invalid record.
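
For illustration, here is a minimal sketch of the question's validation branch with the throw removed (the field names and checks are copied from the question's code; validateEvent is a hypothetical helper name, not something from the original):

// Validate one parsed event; invalid records are logged and skipped (None)
// instead of killing the task with an exception.
def validateEvent(mapa: Map[String, Map[String, Any]]): Option[Map[String, Map[String, Any]]] = {
  val h = mapa.get("header")
  val b = mapa.get("body")
  if (h.toSeq.toString.contains("session.end") && !b.toSeq.toString.contains("duration")) {
    System.err.println("session.end HAS NO DURATION FIELD!")
    None
  } else if (h.isEmpty || h.get.get("userID").isEmpty || h.get.get("timestamp").isEmpty) {
    System.err.println("FIELD IS MISSING: " + mapa) // log it rather than throwing
    None
  } else {
    Some(mapa)
  }
}

// Plugged in exactly where the last flatMap was:
//   ...flatMap(validateEvent)

Since flatMap flattens the Option results, every None simply disappears from the stream and only valid events reach Elasticsearch.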