Spark Structured Streaming with Confluent Cloud Kafka connection issue

Asked: 2019-12-19 03:24:10

Tags: apache-spark pyspark apache-kafka spark-streaming

I am writing a Spark Structured Streaming application in PySpark to read data from Kafka in Confluent Cloud. The documentation for Spark's readStream() is thin and says little about the optional parameters, especially the auth mechanism ones. I am not sure which parameter is wrong and breaking the connection. Can anyone with Spark experience help me get this connection working?

Required parameters

Consumer({
    'bootstrap.servers': 'cluster.gcp.confluent.cloud:9092',
    'sasl.username': 'xxx',
    'sasl.password': 'xxx',
    'sasl.mechanisms': 'PLAIN',
    'security.protocol': 'SASL_SSL',
    'group.id': 'python_example_group_1',
    'auto.offset.reset': 'earliest'
})
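For context: those property names come from the confluent-kafka Python client (librdkafka), while Spark's Kafka source wraps the Java client, whose names differ slightly behind the "kafka." option prefix. A rough mapping sketch (comments only; the xxx placeholders stand for real credentials):

# librdkafka name (confluent-kafka)   ->  Spark Structured Streaming option
# 'bootstrap.servers'                 ->  .option("kafka.bootstrap.servers", ...)
# 'sasl.mechanisms'  (plural)         ->  .option("kafka.sasl.mechanism", ...)  (singular)
# 'security.protocol'                 ->  .option("kafka.security.protocol", ...)
# 'group.id'                          ->  .option("kafka.group.id", ...)
# 'auto.offset.reset'                 ->  .option("startingOffsets", ...)
# 'sasl.username' / 'sasl.password'   ->  no direct equivalent; they are folded
#                                         into kafka.sasl.jaas.config (see Answer 1)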

Here is my PySpark code:

df = spark \
  .readStream \
  .format("kafka") \
  .option("kafka.bootstrap.servers", "cluster.gcp.confluent.cloud:9092") \
  .option("subscribe", "test-topic") \
  .option("kafka.sasl.mechanisms", "PLAIN")\
  .option("kafka.security.protocol", "SASL_SSL")\
  .option("kafka.sasl.username","xxx")\
  .option("kafka.sasl.password", "xxx")\
  .option("startingOffsets", "latest")\
  .option("kafka.group.id", "python_example_group_1")\
  .load()
display(df)

However, I keep getting the error message:

kafkashaded.org.apache.kafka.common.KafkaException: Failed to construct kafka consumer

Databricks notebook - for testing:

https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/4673082066872014/3543014086288496/1802788104169533/latest.html

Documentation:

https://home.apache.org/~pwendell/spark-nightly/spark-branch-2.0-docs/latest/structured-streaming-kafka-integration.html

2 Answers:

Answer 0 (score: 1)

This error indicates that your Kafka consumer cannot see the JAAS configuration. To resolve the issue, include the JAAS file by following the steps below:

Step 1: Create the JAAS file below at /home/jass/path

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useTicketCache=true
    renewTicket=true
    serviceName="kafka";
};
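Note that the entry above is for Kerberos (GSSAPI). For Confluent Cloud's SASL/PLAIN, the analogous JAAS file would presumably use the PlainLoginModule instead (a sketch, reusing the xxx placeholders from the question):

KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="xxx"
    password="xxx";
};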

Step 2: Point spark-submit at that JAAS file path via the conf parameters below.

--conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=/home/jass/path"

Full spark-submit command:

/usr/hdp/2.6.1.0-129/spark2/bin/spark-submit --packages com.databricks:spark-avro_2.11:3.2.0,org.apache.spark:spark-avro_2.11:2.4.0,org.apache.spark:spark-sql-kafka-0-10_2.11:2.2.0 --conf spark.ui.port=4055 --files /home/jass/path,/home/bdpda/bdpda.headless.keytab --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=/home/jass/path" --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=/home/jass/path" pysparkstructurestreaming.py
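For the Confluent Cloud setup in the question (SASL/PLAIN, so no keytab), a minimal equivalent could look like the sketch below; the package version and the /home/jass/path location are assumptions carried over from the Kerberos example:

spark-submit \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0 \
  --files /home/jass/path \
  --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=/home/jass/path" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=/home/jass/path" \
  pysparkstructurestreaming.py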

PySpark Structured Streaming sample code:

from pyspark.sql import SparkSession

#  Spark session (Structured Streaming needs no separate StreamingContext) :

spark = SparkSession.builder.appName('PythonStreamingDirectKafkaWordCount').getOrCreate()

#  Kafka Topic Details :

KAFKA_TOPIC_NAME_CONS = "topic_name"
KAFKA_OUTPUT_TOPIC_NAME_CONS = "topic_to_hdfs"
KAFKA_BOOTSTRAP_SERVERS_CONS = 'kafka_server:9093'

#  Creating  readstream DataFrame :

df = spark.readStream \
     .format("kafka") \
     .option("kafka.bootstrap.servers", KAFKA_BOOTSTRAP_SERVERS_CONS) \
     .option("subscribe", KAFKA_TOPIC_NAME_CONS) \
     .option("startingOffsets", "earliest") \
     .option("kafka.security.protocol","SASL_SSL")\
     .option("kafka.client.id" ,"Clinet_id")\
     .option("kafka.sasl.kerberos.service.name","kafka")\
     .option("kafka.ssl.truststore.location", "/home/path/kafka_trust.jks") \
     .option("kafka.ssl.truststore.password", "password_rd") \
     .option("kafka.sasl.kerberos.keytab","/home/path.keytab") \
     .option("kafka.sasl.kerberos.principal","path") \
     .load()

# Cast the Kafka message value from bytes to a string:
df1 = df.selectExpr("CAST(value AS STRING)")

#  Creating  writeStream query :

# Keep a handle on the streaming query so the driver can block on it.
query = df1.writeStream \
   .option("path","target_directory") \
   .format("csv") \
   .option("checkpointLocation","checkpoint_directory") \
   .outputMode("append") \
   .start()

query.awaitTermination()

Answer 1 (score: 0)

We need to specify kafka.sasl.jaas.config to supply the username and password for Confluent Kafka's SASL-SSL auth method. Its value looks a bit odd, but it works.

df = spark \
      .readStream \
      .format("kafka") \
      .option("kafka.bootstrap.servers", "pkc-43n10.us-central1.gcp.confluent.cloud:9092") \
      .option("subscribe", "wallet_txn_log") \
      .option("startingOffsets", "earliest") \
      .option("kafka.security.protocol","SASL_SSL") \
      .option("kafka.sasl.mechanism", "PLAIN") \
      .option("kafka.sasl.jaas.config", """kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required username="xxx" password="xxx";""").load()
display(df)
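One caveat: the kafkashaded. prefix on the login module class is specific to Databricks, which shades the Kafka client into its runtime. On a stock Spark distribution (an assumption, if you run this outside Databricks), the unshaded class name should be used instead, for example:

# Sketch for non-Databricks Spark: the Kafka classes are not relocated,
# so the JAAS config string references the plain class name.
jaas_config = (
    'org.apache.kafka.common.security.plain.PlainLoginModule required '
    'username="xxx" password="xxx";'
)
# ...then pass it via .option("kafka.sasl.jaas.config", jaas_config)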