I wrote a custom Kafka Connect plugin to read a file and publish Avro messages to a topic. I use a LineNumberReader to read the file line by line, combine each record with a schema, and publish it to the topic. Records are only published when I return the list of SourceRecords from the poll method. Sometimes, when the file is large, I run into OutOfMemory errors. I have researched this and read many blogs and SO questions, but nothing seems to work for me. I read about the commit() method on SourceTask, but it is not very clear to me. Has anyone run into a similar problem? Code snippet below:
@Override
public List<SourceRecord> poll() throws InterruptedException {
    final List<SourceRecord> results = new ArrayList<>();
    // Get reader based on certain params:
    FileReader reader = myReader;
    while (reader.hasNext()) {
        results.add(getSourceRecord(file, reader.currentOffset(), reader.next()));
    }
    return results;
}
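(Not part of the original post, just a sketch of what I have been considering.) Since Connect calls poll() repeatedly for as long as the task runs, one way to bound memory would be to return a fixed-size batch per call and keep the reader open between calls, rather than draining the whole file into one list. The self-contained analogue below uses a plain BufferedReader instead of my FileReader, and the BATCH_SIZE value is an assumption:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class BatchedPoll {
    private static final int BATCH_SIZE = 2; // assumption: small value for the demo

    private final BufferedReader reader; // kept open across poll() calls

    BatchedPoll(BufferedReader reader) {
        this.reader = reader;
    }

    // Analogue of SourceTask.poll(): return at most BATCH_SIZE lines per call,
    // so memory stays bounded no matter how large the file is.
    List<String> poll() throws IOException {
        List<String> batch = new ArrayList<>();
        String line;
        while (batch.size() < BATCH_SIZE && (line = reader.readLine()) != null) {
            batch.add(line);
        }
        // An empty list signals "nothing right now"; the framework calls poll() again.
        return batch;
    }

    public static void main(String[] args) throws IOException {
        BatchedPoll task = new BatchedPoll(new BufferedReader(new StringReader("a\nb\nc")));
        List<String> all = new ArrayList<>();
        List<String> batch;
        while (!(batch = task.poll()).isEmpty()) {
            System.out.println("batch=" + batch);
            all.addAll(batch);
        }
        System.out.println("total=" + all.size());
    }
}
```

Running this prints two batches ([a, b] and [c]) followed by the total line count, showing that no call ever holds more than BATCH_SIZE lines.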
private SourceRecord getSourceRecord(String fileName, Offset offset, Struct struct) {
    return new SourceRecord(
        new HashMap<String, Object>() {
            {
                put("path", fileName);
            }
        },
        Collections.singletonMap("offset", offset.getRecordOffset()),
        config.getWriteTopic(),
        struct.schema(),
        struct
    );
}
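For context on the two maps above (this is my understanding, not from any one doc): the first map is the source partition, identifying which file a record came from, and the second is the source offset within it. Connect persists the last committed offset per partition and hands it back on restart, so a task can seek instead of re-reading. A self-contained illustration, with a hypothetical path and offset value:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class OffsetMaps {
    public static void main(String[] args) {
        // sourcePartition: identifies *which* file the record came from
        Map<String, Object> sourcePartition = new HashMap<>();
        sourcePartition.put("path", "/data/input.txt"); // hypothetical path

        // sourceOffset: identifies *where* in that file the task is (line number)
        Map<String, Object> sourceOffset = Collections.singletonMap("offset", 42L);

        // On restart, Connect returns the last committed sourceOffset for each
        // sourcePartition, so the reader can skip ahead to that line.
        System.out.println(sourcePartition.get("path") + " -> " + sourceOffset.get("offset"));
    }
}
```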
public boolean hasNext() {
    if (currentLine != null) {
        return true;
    } else if (finished) {
        return false;
    } else {
        try {
            while (true) {
                String line = reader.readLine();
                offset.setOffset(reader.getLineNumber());
                if (line == null) {
                    finished = true;
                    return false;
                }
                currentLine = line;
                // Removed for brevity
                return true;
            }
        } catch (IOException e) {
            throw new ConnectException("Error reading file", e);
        }
    }
}