Apache Flink: Read data from Kafka as a byte array

Time: 2017-11-26 09:07:14

Tags: deserialization apache-flink kafka-consumer-api flink-streaming

How can I read data from Kafka as a byte[]?

I have an implementation that reads events as String using SimpleStringSchema(), but I could not find a schema for reading the data as byte[].

Here is my code:

```java
Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "kafka1:9092");
properties.setProperty("zookeeper.connect", "zookeeper1:2181");
properties.setProperty("group.id", "test");
properties.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
properties.setProperty("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
properties.setProperty("auto.offset.reset", "earliest");

DataStream<byte[]> stream = env
    .addSource(new FlinkKafkaConsumer010<byte[]>("testStr", ? , properties));
```

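One way such a schema could be sketched is by extending Flink's AbstractDeserializationSchema and returning the record bytes unchanged; the class name ByteArraySchema is illustrative, and the package of AbstractDeserializationSchema varies across Flink versions:

```java
// Package location of AbstractDeserializationSchema differs between Flink versions;
// in older releases it lived under org.apache.flink.streaming.util.serialization.
import org.apache.flink.api.common.serialization.AbstractDeserializationSchema;

// Hypothetical pass-through schema: hands each Kafka record's value bytes
// to Flink as-is, without any decoding.
public class ByteArraySchema extends AbstractDeserializationSchema<byte[]> {
    @Override
    public byte[] deserialize(byte[] message) {
        return message;
    }
}
```

An instance of this class would then presumably take the place of the `?` placeholder, e.g. `new FlinkKafkaConsumer010<byte[]>("testStr", new ByteArraySchema(), properties)`.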

0 Answers:

There are no answers.