How to write Avro files to S3 in Flink?

Asked: 2019-07-10 22:06:42

Tags: apache-flink

I want to read streaming data from a Kafka topic and write it to S3 in Avro or Parquet format. The data stream looks like JSON strings, but I can't manage to convert it and write it to S3 as Avro or Parquet.

I found some code snippets and tried this:

val sink = StreamingFileSink
  .forBulkFormat(new Path(outputS3Path), ParquetAvroWriters.forReflectRecord(classOf[myClass]))
  .build()

But on addSink I get a "Type mismatch, expected: SinkFunction[String], actual: StreamingFileSink[TextOut]" error:

val stream = env
  .addSource(myConsumerSource)
  .addSink(sink)

Please help, thanks!

2 answers:

Answer 0 (score: 0)

Answer 1 (score: 0)

Here is the code I used to write Parquet files to the local filesystem.

import org.apache.avro.generic.GenericRecord
import org.apache.avro.{Schema, SchemaBuilder}
import org.apache.flink.core.fs.Path
import org.apache.flink.formats.parquet.avro.ParquetAvroWriters
import org.apache.flink.streaming.api.datastream.DataStreamSource
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink

val env = StreamExecutionEnvironment.getExecutionEnvironment()
// Bulk formats roll part files on every checkpoint, so checkpointing must
// be enabled or the StreamingFileSink never commits finished files.
env.enableCheckpointing(100)
// Avro schema with a single required string field named "message".
val schema = SchemaBuilder
  .record("record")
  .fields()
  .requiredString("message")
  .endRecord()
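
// Not in the original answer: a minimal sketch of the missing input,
// since genericRecordList is used below but never defined in the post.
val record: GenericRecord = new org.apache.avro.generic.GenericData.Record(schema)
record.put("message", "hello")
val genericRecordList = java.util.Collections.singletonList(record)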

val stream: DataStreamSource[GenericRecord] = env.fromCollection(genericRecordList)
val path = new Path(s"/tmp/flink-parquet-${System.currentTimeMillis()}")
// Bulk-format sink: each part file is written as Parquet using the schema above.
val sink: StreamingFileSink[GenericRecord] = StreamingFileSink
  .forBulkFormat(path, ParquetAvroWriters.forGenericRecord(schema))
  .build()

stream.addSink(sink)
env.execute()
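
To write to S3 instead of the local filesystem, the same bulk-format sink can point at an s3:// path, as long as an S3 filesystem implementation (for example the flink-s3-fs-hadoop dependency) is available to Flink and checkpointing is enabled. As for the type mismatch in the question: addSink only compiles when the stream's element type matches the sink's element type, so the JSON strings from Kafka must be mapped to the Avro-compatible class before the sink. Below is a minimal sketch using the Scala DataStream API; MyClass, the bucket path, and the parsing step are placeholders, not names from the original post.

import org.apache.flink.core.fs.Path
import org.apache.flink.formats.parquet.avro.ParquetAvroWriters
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink
import org.apache.flink.streaming.api.scala._

// Hypothetical record type; Avro reflection derives its schema.
case class MyClass(message: String)

object JsonToParquetOnS3 {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.enableCheckpointing(60000) // part files are committed on checkpoints

    // Placeholder source; the question reads from Kafka via myConsumerSource.
    val jsonStream: DataStream[String] = env.fromElements("""{"message":"hello"}""")

    // Map the JSON strings to MyClass so the stream's element type matches
    // the sink's. A real job would parse the JSON with a library here.
    val parsed: DataStream[MyClass] = jsonStream.map(json => MyClass(json))

    val sink: StreamingFileSink[MyClass] = StreamingFileSink
      .forBulkFormat(
        new Path("s3://my-bucket/output"), // placeholder bucket and prefix
        ParquetAvroWriters.forReflectRecord(classOf[MyClass]))
      .build()

    parsed.addSink(sink)
    env.execute("json-to-parquet-on-s3")
  }
}

The bucket name, job name, and parsing logic above are placeholders to adapt; the point is that the map step makes the stream and sink element types agree, which removes the compile error from the question.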