PySpark Streaming with Kafka error

Date: 2018-03-02 11:05:15

Tags: python spark-streaming mapr mapr-streams

I am using Spark 2.1.0 with Kafka 0.9 in a MapR environment, and I am trying to read from a Kafka topic into Spark Streaming. However, when I run the KafkaUtils createDirectStream command, I get the following error.

py4j.protocol.Py4JError: An error occurred while calling z:org.apache.spark.streaming.kafka09.KafkaUtilsPythonHelper.createDirectStream.
Trace:
    py4j.Py4JException: Method createDirectStream([class org.apache.spark.streaming.api.java.JavaStreamingContext, class
java.util.ArrayList, class java.util.HashMap]) does not exist
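A `Method ... does not exist` Py4J error usually means the Python call signature does not match any `createDirectStream` overload on the JVM side, most often because the Kafka streaming connector jar that matches your Spark/Kafka versions is missing from the classpath or is a different version than the Python bindings expect. As a sketch (the jar path and version below are assumptions and will differ per MapR installation), launching the shell with the connector jar made explicit can rule this out:

```shell
# Hypothetical path -- adjust to your MapR/Spark installation.
# The kafka-0-9 streaming connector must match the running Spark
# version on both the driver and executor classpaths.
pyspark --jars /opt/mapr/spark/spark-2.1.0/jars/spark-streaming-kafka-0-9_2.11-2.1.0.jar
```

If the jar is already bundled by the distribution, comparing its version against the installed Spark version is the next thing to check.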

The code I am running:

from __future__ import print_function
import sys
from pyspark import SparkContext, SparkConf
from pyspark.streaming import StreamingContext
from pyspark.sql import SQLContext
from pyspark.streaming.kafka09 import KafkaUtils

# sc is the SparkContext provided by the pyspark shell
sqlContext = SQLContext(sc)
ssc = StreamingContext(sc, 3)
strLoc = '/home/mapr/stream:info'
kafkaparams = {"zookeeper.connect": "x.x.x.x:5181", "metadata.broker.list": "x.x.x.x:9092"}

strarg = KafkaUtils.createDirectStream(ssc, [strLoc], kafkaparams)  # <- error is raised here in the pyspark shell
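One detail worth checking alongside the classpath: the `kafka09` package targets the Kafka 0.9 new-consumer API, which is configured with `bootstrap.servers` and `group.id` rather than the old `zookeeper.connect`/`metadata.broker.list` properties. A minimal sketch of new-consumer style parameters (the group id is a placeholder; the deserializer classes are the standard ones shipped with Kafka):

```python
# New-consumer style configuration for the Kafka 0.9 API.
# "group.id" is a placeholder value for illustration.
kafkaparams = {
    "bootstrap.servers": "x.x.x.x:9092",
    "group.id": "spark-streaming-test",
    "key.deserializer": "org.apache.kafka.common.serialization.StringDeserializer",
    "value.deserializer": "org.apache.kafka.common.serialization.StringDeserializer",
}
print(sorted(kafkaparams))
```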

1 answer:

Answer 0: (score: 0)

Here is a cleaned-up version of your code. Please try running the following.

from pyspark.sql import SQLContext, SparkSession
from pyspark.streaming import StreamingContext
from confluent_kafka.avro.cached_schema_registry_client import CachedSchemaRegistryClient
from confluent_kafka.avro.serializer.message_serializer import MessageSerializer
from pyspark.streaming.kafka import KafkaUtils
import json

var_schema_url = 'http://localhost:8081'
var_kafka_parms_src = {"metadata.broker.list": 'localhost:9092'}

schema_registry_client = CachedSchemaRegistryClient(var_schema_url)
serializer = MessageSerializer(schema_registry_client)

spark = SparkSession.builder \
  .appName('Advertiser_stream') \
  .master('local[*]') \
  .getOrCreate()


def handler(message):
    records = message.collect()
    for record in records:
        # process each record here
        pass


sc = spark.sparkContext
ssc = StreamingContext(sc, 5)

kvs = KafkaUtils.createDirectStream(ssc, ['Topic-name'], var_kafka_parms_src, valueDecoder=serializer.decode_message)
kvs.foreachRDD(handler)

ssc.start()
ssc.awaitTermination()
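Inside `handler`, `message.collect()` brings the micro-batch back to the driver as a plain Python list, so the per-record logic can be factored out and tested without Spark at all. A minimal sketch, using a list of JSON strings in place of a real RDD's `collect()` output (the field names are made up for illustration):

```python
import json

def process_records(records):
    """Parse each JSON record and return the parsed dicts.

    `records` is whatever message.collect() yields; here a plain
    list of JSON strings stands in for it.
    """
    parsed = []
    for record in records:
        parsed.append(json.loads(record))
    return parsed

# Stand-in for one micro-batch (field names are illustrative only)
batch = ['{"id": 1, "clicks": 10}', '{"id": 2, "clicks": 3}']
print(process_records(batch))
```

Keeping the parsing logic in a standalone function like this also makes the `foreachRDD` handler itself trivial: it just calls `process_records(message.collect())`.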