Spark Streaming readStream cannot read from secured Kafka (Event Streams)

Asked: 2019-05-18 23:05:24

Tags: apache-spark apache-kafka spark-streaming spark-streaming-kafka ibm-eventstreams

I am sending data from a producer program to a secured Kafka cluster (IBM Event Streams on IBM Cloud, a Cloud Foundry service), and then in my consumer application (Spark Structured Streaming) I try to read data from that same Kafka source.

Here are the Properties I set up in my producer:

import java.util.Properties

import org.apache.kafka.clients.CommonClientConfigs
import org.apache.kafka.clients.producer.ProducerConfig
import org.apache.kafka.common.config.{SaslConfigs, SslConfigs}

def getProperties: Properties = {
    val configs = new Properties()

    configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
    configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
    configs.put(ProducerConfig.CLIENT_ID_CONFIG, "kafka-java-console-sample-producer")
    configs.put(ProducerConfig.ACKS_CONFIG, "1")
    configs.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "<url:port for 5 brokers>")
    configs.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL")
    configs.put(SaslConfigs.SASL_MECHANISM, "PLAIN")
    configs.put(SaslConfigs.SASL_JAAS_CONFIG, "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"token\" password=\"" + "<some apikey here>" + "\";")
    configs.put(SslConfigs.SSL_PROTOCOL_CONFIG, "TLSv1.2")
    configs.put(SslConfigs.SSL_ENABLED_PROTOCOLS_CONFIG, "TLSv1.2")
    configs.put(SslConfigs.SSL_ENDPOINT_IDENTIFICATION_ALGORITHM_CONFIG, "HTTPS")

    configs
}
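When debugging a setup like this, it can help to rule out Spark entirely and consume with a plain Kafka client using the same credentials. The sketch below is hypothetical: it reuses the broker list, API key placeholder, and the `raw_weather` topic from this question, and a made-up group id `connectivity-check`.

```scala
import java.time.Duration
import java.util.{Collections, Properties}

import org.apache.kafka.clients.CommonClientConfigs
import org.apache.kafka.clients.consumer.{ConsumerConfig, KafkaConsumer}
import org.apache.kafka.common.config.{SaslConfigs, SslConfigs}

// Hypothetical connectivity check: consume a few records with the same
// SASL_SSL settings the producer uses, bypassing Spark entirely.
def checkConnectivity(): Unit = {
  val props = new Properties()
  props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "<url:port for 5 brokers>")
  props.put(ConsumerConfig.GROUP_ID_CONFIG, "connectivity-check")
  props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")
  props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
    "org.apache.kafka.common.serialization.StringDeserializer")
  props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
    "org.apache.kafka.common.serialization.StringDeserializer")
  props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL")
  props.put(SaslConfigs.SASL_MECHANISM, "PLAIN")
  props.put(SaslConfigs.SASL_JAAS_CONFIG,
    "org.apache.kafka.common.security.plain.PlainLoginModule required " +
      "username=\"token\" password=\"<some apikey here>\";")
  props.put(SslConfigs.SSL_PROTOCOL_CONFIG, "TLSv1.2")

  val consumer = new KafkaConsumer[String, String](props)
  try {
    consumer.subscribe(Collections.singletonList("raw_weather"))
    // If this prints a non-zero count, credentials and networking are fine
    // and the problem is on the Spark side.
    val records = consumer.poll(Duration.ofSeconds(10))
    println(s"fetched ${records.count()} records")
  } finally consumer.close()
}
```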

And here is the code I use to send data to the Kafka cluster:

    val producer = new KafkaProducer[String, String](getProperties)

    /** Reading the file line by line */
    for (line <- file.getLines) {
        /** Sending the lines to the $topic inside the Kafka cluster initialized inside $producer */
        val data = new ProducerRecord[String, String](topic, "key", line)
        producer.send(data)
    }
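One thing worth noting about the loop above: `KafkaProducer.send` is asynchronous, so if the program exits right after the loop, buffered records can be silently dropped and send errors are never surfaced. A sketch of the same loop with an explicit flush, close, and an error-reporting callback (the structure is mine, not from the question):

```scala
import org.apache.kafka.clients.producer.{Callback, KafkaProducer, ProducerRecord, RecordMetadata}

val producer = new KafkaProducer[String, String](getProperties)
try {
  for (line <- file.getLines) {
    val data = new ProducerRecord[String, String](topic, "key", line)
    // The callback surfaces broker-side errors that a fire-and-forget send hides.
    producer.send(data, new Callback {
      override def onCompletion(metadata: RecordMetadata, exception: Exception): Unit =
        if (exception != null) exception.printStackTrace()
    })
  }
  producer.flush() // block until all buffered records are acknowledged
} finally {
  producer.close()
}
```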

I can confirm that this does send data to the Kafka cluster, because I can see the data arriving in the cluster via the Grafana metrics provided by IBM Cloud.

Now, in my Spark Streaming code, this is how I try to read from the Kafka source:

val df = spark.readStream
        .format("kafka")
        .option("subscribe", "raw_weather")
        .option("kafka.bootstrap.servers", "<url:port for the same 5 brokers>")
        .option("kafka.security.protocol", "SASL_SSL")
        .option("kafka.sasl.mechanism", "PLAIN")
        .option("kafka.sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"token\" password=\"" + "<that same password given above>" + "\";")
        .option("kafka.ssl.protocol", "TLSv1.2")
        .option("kafka.ssl.enabled.protocols", "TLSv1.2")
        .option("kafka.ssl.endpoint.identification.algorithm", "HTTPS")
        .load()
        .selectExpr("CAST(value as STRING)")
        .as[String]
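One common reason a Structured Streaming Kafka source shows nothing is that `startingOffsets` defaults to `latest`, so the query only sees records produced *after* it starts. A sketch of the same read with an explicit starting offset, useful for verifying that the connection and credentials work (everything else is unchanged from the code above):

```scala
// Assumption: the producer ran before the query started; reading from
// "earliest" makes those already-written records visible to the stream.
val df = spark.readStream
    .format("kafka")
    .option("subscribe", "raw_weather")
    .option("kafka.bootstrap.servers", "<url:port for the same 5 brokers>")
    .option("startingOffsets", "earliest") // default is "latest"
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config",
      "org.apache.kafka.common.security.plain.PlainLoginModule required " +
        "username=\"token\" password=\"<that same password given above>\";")
    .option("kafka.ssl.protocol", "TLSv1.2")
    .option("kafka.ssl.enabled.protocols", "TLSv1.2")
    .option("kafka.ssl.endpoint.identification.algorithm", "HTTPS")
    .load()
    .selectExpr("CAST(value AS STRING)")
    .as[String]
```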

Followed by:

    val query = df.writeStream
        .outputMode(OutputMode.Append())
        .foreachBatch((df: DataFrame, id: Long) => {
            println(df.count() + " " + id)
        })
        .trigger(Trigger.ProcessingTime(1))
        .start()

    query.awaitTermination()
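A side note on the sink above: `Trigger.ProcessingTime(1)` with a bare number means 1 millisecond, which fires micro-batches as fast as possible; a duration string is usually what is intended. A sketch with an explicit interval and a checkpoint location (the path is a placeholder of mine):

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.streaming.{OutputMode, Trigger}

val query = df.writeStream
    .outputMode(OutputMode.Append())
    .option("checkpointLocation", "/tmp/raw_weather_checkpoint") // placeholder path
    .foreachBatch { (batch: DataFrame, id: Long) =>
      println(s"batch $id: ${batch.count()} rows")
    }
    .trigger(Trigger.ProcessingTime("5 seconds")) // one micro-batch every 5 s
    .start()

query.awaitTermination()
```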

I am not sure why, but my Spark Streaming job cannot read any data from the source at all. When I start the Spark Streaming program, it shows this in the output:

19/05/19 04:22:28 DEBUG SparkEnv: Using serializer: class org.apache.spark.serializer.JavaSerializer
19/05/19 04:22:28 INFO SparkEnv: Registering MapOutputTracker
19/05/19 04:22:28 INFO SparkEnv: Registering BlockManagerMaster
19/05/19 04:22:28 INFO SparkEnv: Registering OutputCommitCoordinator
0 0

When I run the producer again, Spark Streaming still stays at 0 0. I am not sure what I have written wrong here.
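Another thing to verify is that the job is submitted with the Kafka source for Structured Streaming, which ships as a separate artifact from Spark itself. A launch-command sketch; the Spark/Scala versions, class name, and jar path are placeholders to be adapted:

```
# The class name and jar path below are placeholders; match the
# spark-sql-kafka version to your Spark and Scala versions.
spark-submit \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.3 \
  --class com.example.WeatherConsumer \
  target/weather-consumer.jar
```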

EDIT: I kept the consumer running for more than 7 hours, and it still shows no change.

0 Answers:

There are no answers yet.