Unable to convert byte[] to String in Scala

Posted: 2016-12-20 11:52:59

Tags: spark-streaming spark-dataframe kafka-consumer-api kafka-producer-api spark-avro

**I am trying to stream data from Kafka and convert it into a DataFrame. I followed this link, but when I run the producer and consumer applications, this is the output on my console:**

(0,[B@370ed56a) (1,[B@2edd3e63) (2,[B@3ba2944d) (3,[B@2eb669d1) (4,[B@49dd304c) (5,[B@4f6af565) (6,[B@7714e29e)

This is actually the output of the Kafka producer; before these messages were pushed, the topic was empty.
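The `[B@370ed56a`-style tokens in that output are the key clue: `[B@<hex>` is the JVM's default `Object.toString` for a `byte[]` (the type tag `[B` plus an identity hash code), so printing a byte array directly always looks like this regardless of its contents. A minimal stdlib-only sketch (the `ByteArrayPrinting` object name is made up for illustration):

```scala
// Demonstrates why printing an Array[Byte] yields "[B@<hex>":
// JVM arrays do not override toString, so you see the type tag and
// identity hash, never the contents.
object ByteArrayPrinting {
  def main(args: Array[String]): Unit = {
    val bytes: Array[Byte] = "hello".getBytes("UTF-8")
    println(bytes)                      // prints something like [B@1b6d3586
    // To see the contents you must decode explicitly -- and this only
    // works when the bytes really are text, not Avro binary:
    println(new String(bytes, "UTF-8")) // prints: hello
  }
}
```

For an Avro-encoded payload, `new String(bytes)` would just produce mojibake; the bytes have to go back through an Avro decoder instead.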

Here is the producer code snippet:

import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import com.twitter.bijection.Injection;
import com.twitter.bijection.avro.GenericAvroCodecs;

Properties props = new Properties();
props.put("bootstrap.servers", "##########:9092");
props.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer",
        "org.apache.kafka.common.serialization.ByteArraySerializer");
props.put("producer.type", "async");

// Serialize each GenericRecord to Avro binary via Twitter Bijection
Schema.Parser parser = new Schema.Parser();
Schema schema = parser.parse(EVENT_SCHEMA);
Injection<GenericRecord, byte[]> records = GenericAvroCodecs.toBinary(schema);

KafkaProducer<String, byte[]> producer = new KafkaProducer<String, byte[]>(props);
for (int i = 0; i < 100; i++) {
    GenericData.Record avroRecord = new GenericData.Record(schema);
    setEventValues(i, avroRecord);
    byte[] messages = records.apply(avroRecord);
    ProducerRecord<String, byte[]> producerRecord = new ProducerRecord<String, byte[]>(
            "topic", String.valueOf(i), messages);
    System.out.println(producerRecord); // value is a byte[], so it prints as [B@...
    producer.send(producerRecord);
}
producer.close(); // flush pending sends before exiting

Its output is:

key=0, value=[B@680387a  key=1, value=[B@32bfb588  key=2, value=[B@2ac2e1b1  key=3, value=[B@606f4165  key=4, value=[B@282e7f59

And here is the consumer code snippet, written in Scala:

"group.id" -> "KafkaConsumer",
"zookeeper.connection.timeout.ms" -> "1000000"

val topicMaps = Map("topic" -> 1)
val messages = KafkaUtils.createStream[String, Array[Byte], StringDecoder, DefaultDecoder](ssc, kafkaConf, topicMaps, StorageLevel.MEMORY_ONLY_SER)
messages.print()

I have tried both StringDecoder and DefaultDecoder in createStream(). I am sure the producer and consumer agree with each other. Can anyone help?
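Since the payload is Avro binary rather than UTF-8 text, no String decoder can make it readable on its own. One possible direction (a sketch only, not a tested answer: it assumes the consumer has the same `EVENT_SCHEMA` string and the Twitter Bijection Avro codecs on its classpath, and that `messages` is the `DStream[(String, Array[Byte])]` from the snippet above) is to keep `DefaultDecoder` and invert the same `Injection` the producer used:

```scala
// Sketch: decode the Avro-binary values back into GenericRecords.
// Assumes bijection-avro on the consumer classpath and the producer's schema.
import org.apache.avro.Schema
import org.apache.avro.generic.GenericRecord
import com.twitter.bijection.Injection
import com.twitter.bijection.avro.GenericAvroCodecs

val decoded = messages.map { case (key, bytes) =>
  // Build the Injection inside the closure so it is created on the
  // executors rather than shipped from the driver.
  val schema = new Schema.Parser().parse(EVENT_SCHEMA)
  val injection: Injection[GenericRecord, Array[Byte]] =
    GenericAvroCodecs.toBinary(schema)
  // invert returns a Try[GenericRecord]; .get will throw on corrupt bytes
  val record: GenericRecord = injection.invert(bytes).get
  (key, record.toString) // GenericRecord.toString renders a readable JSON form
}
decoded.print()
```

With the records decoded this way, the fields become accessible and the stream can then be mapped into rows for a DataFrame.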

0 Answers:

There are no answers yet.