Apache Flink: Job throws a StackOverflowError

Asked: 2019-07-16 14:15:02

Tags: apache-flink flink-streaming

I am trying to run this simple job in Apache Flink:

import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.timestamps.AscendingTimestampExtractor;

// Jackson imports (assuming the plain jackson-databind dependency, not Flink's shaded copy)
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.util.Properties;

public class StreamingJob {

    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);

        Properties inputProperties = new Properties();
        ObjectMapper mapper = new ObjectMapper();

        // Read raw lines from a file via the custom source below.
        DataStream<String> eventStream = env
                .addSource(new FileSourceFunction("/path/to/file"));

        // Parse each line into a Jackson ObjectNode.
        DataStream<ObjectNode> eventStreamObject = eventStream
                .map(x -> mapper.readValue(x, ObjectNode.class));

        // Assign event timestamps extracted from the "ts" field.
        DataStream<ObjectNode> eventStreamWithTime = eventStreamObject
                .assignTimestampsAndWatermarks(new AscendingTimestampExtractor<ObjectNode>() {
                    @Override
                    public long extractAscendingTimestamp(ObjectNode element) {
                        String data = element.get("ts").asText();
                        if (data.endsWith("Z")) {
                            data = data.substring(0, data.length() - 1);
                        }
                        return LocalDateTime.parse(data).toEpochSecond(ZoneOffset.UTC);
                    }
                });

        eventStreamObject.print();
        env.execute("Local job");
    }
}

FileSourceFunction is a custom SourceFunction:

import org.apache.flink.streaming.api.functions.source.SourceFunction;

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.Iterator;
import java.util.stream.Stream;

public class FileSourceFunction implements SourceFunction<String> {

    private static final long serialVersionUID = 1L;

    private String fileName;
    private volatile boolean isRunning = true;

    public FileSourceFunction(String fileName) {
        this.fileName = fileName;
    }

    @Override
    public void run(SourceContext<String> ctx) throws Exception {
        // Emit the file line by line, holding the checkpoint lock while collecting.
        try (BufferedReader br = new BufferedReader(new FileReader(fileName))) {
            try (Stream<String> stream = br.lines()) {
                Iterator<String> it = stream.iterator();
                while (isRunning && it.hasNext()) {
                    synchronized (ctx.getCheckpointLock()) {
                        ctx.collect(it.next());
                    }
                }
            }
        }
    }

    @Override
    public void cancel() {
        isRunning = false;
    }
}

When I run the job, it throws a StackOverflowError. I am using Apache Flink 1.8.1.
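For reference, here is a stripped-down sketch of the same pipeline that could be used to narrow down where the error comes from: the same source and JSON parsing, but with an explicit MapFunction instead of the lambda and without the timestamp assigner. The class name StreamingJobDebug is just a placeholder, and I have not verified that this variant avoids the error.

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

// Debugging sketch only: same source, JSON parsing via an explicit MapFunction,
// no timestamp/watermark assignment. Not a confirmed fix for the StackOverflowError.
public class StreamingJobDebug {

    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<String> eventStream = env
                .addSource(new FileSourceFunction("/path/to/file"));

        DataStream<ObjectNode> parsed = eventStream
                .map(new MapFunction<String, ObjectNode>() {
                    // Created lazily on the worker so the function does not serialize an ObjectMapper.
                    private transient ObjectMapper mapper;

                    @Override
                    public ObjectNode map(String value) throws Exception {
                        if (mapper == null) {
                            mapper = new ObjectMapper();
                        }
                        return mapper.readValue(value, ObjectNode.class);
                    }
                });

        parsed.print();
        env.execute("Local job (debug)");
    }
}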

0 Answers:

No answers yet.