I was trying to write my own Apache Beam demo: it consumes data that a shopping system publishes to Kafka, builds the dataflow with Beam, and runs on Flink. I hit a very strange exception:
Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.flink.streaming.api.operators.InternalTimer, but class was expected
at org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator.fireTimer(WindowDoFnOperator.java:129)
at org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.onProcessingTime(DoFnOperator.java:704)
at org.apache.flink.streaming.api.operators.InternalTimerServiceImpl.onProcessingTime(InternalTimerServiceImpl.java:235)
at org.apache.flink.streaming.runtime.tasks.SystemProcessingTimeService$TriggerTask.run(SystemProcessingTimeService.java:285)
My code is:
package com.meikeland.dataflow;
import org.apache.beam.runners.flink.FlinkRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.*;
import org.apache.beam.sdk.transforms.windowing.*;
import org.apache.beam.sdk.values.KV;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.joda.time.Duration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class GameStats {

  private static final Logger logger = LoggerFactory.getLogger(GameStats.class);

  public static void main(String[] args) {
    KFOptions options = PipelineOptionsFactory.fromArgs(args).as(KFOptions.class);
    options.setRunner(FlinkRunner.class);
    options.setStreaming(true);
    logger.info("brokers address is: {}", options.getBrokers());
    runDemoCount(options);
  }

  private static void runDemoCount(KFOptions options) {
    Pipeline pipeline = Pipeline.create(options);

    pipeline
        // read order events from kafka
        .apply("ConsumeKafka",
            KafkaIO.<Long, String>read()
                .withBootstrapServers(options.getBrokers())
                .withTopic("tracking.order.goods")
                .withKeyDeserializer(LongDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                .withLogAppendTime()
                .withoutMetadata())
        .apply(Values.create())
        .apply("ParseOrderInfo", ParDo.of(new ParseOrderInfoFn()))
        .apply("SetTimestamp", WithTimestamps.of(OrderInfo::getCreatedAt))
        .apply("ExtractOrderID", MapElements.via(new SimpleFunction<OrderInfo, Integer>() {
          @Override
          public Integer apply(OrderInfo o) {
            logger.info("processed orderID: {}", o.getOrderID());
            return o.getOrderID();
          }
        }))
        // window into fixed one-minute windows with early and late firings
        .apply("FixedWindowsOrderID",
            Window.<Integer>into(FixedWindows.of(new Duration(1000 * 60)))
                .triggering(AfterWatermark.pastEndOfWindow()
                    .withEarlyFirings(
                        AfterProcessingTime.pastFirstElementInPane().plusDelayOf(new Duration(1000 * 60)))
                    .withLateFirings(AfterPane.elementCountAtLeast(1)))
                .withAllowedLateness(new Duration(1000 * 60))
                .accumulatingFiredPanes())
        .apply("Count", Count.<Integer>perElement())
        .apply("ToString", ParDo.of(new DoFn<KV<Integer, Long>, String>() {
          @ProcessElement
          public void processElement(@Element KV<Integer, Long> element, IntervalWindow window,
              OutputReceiver<String> r) {
            logger.info("the order is : {}, and count is : {}", element.getKey(), element.getValue());
            r.output(String.format("interval :%s, Order ID: %d, Count :%d",
                window.start().toString(), element.getKey(), element.getValue()));
          }
        }))
        .apply("WriteToKafka",
            KafkaIO.<Void, String>write()
                .withBootstrapServers(options.getBrokers())
                .withTopic("streaming.order.count")
                .withValueSerializer(StringSerializer.class)
                .values());

    pipeline.run().waitUntilFinish();
  }
}
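KFOptions, OrderInfo, and ParseOrderInfoFn are helper classes from my project that I have not included here. A minimal sketch of what the options interface looks like, purely for illustration, since only the brokers getter matters for the snippet above:

// Sketch only: the real KFOptions is not shown above; this hypothetical version
// just exposes the Kafka brokers on top of the standard Flink pipeline options.
import org.apache.beam.runners.flink.FlinkPipelineOptions;
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;

public interface KFOptions extends FlinkPipelineOptions {
  @Description("Kafka bootstrap servers, e.g. host1:9092,host2:9092")
  @Default.String("localhost:9092")
  String getBrokers();

  void setBrokers(String brokers);
}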
The error seems to be in the windowing, but I can't figure it out, and nothing I found by googling looks like a similar error, so I must have gotten some small detail wrong. Can anyone help me?
Answer 0 (score: 0)
I ran into the same problem and solved it by checking that my Flink version was compatible with Beam:
https://beam.apache.org/documentation/runners/flink/
In my case I had Beam 2.6 and Flink 1.5.4.
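Concretely, that means pinning the Flink runner artifact to the Beam release that targets your Flink version. As a sketch (artifact name quoted from memory, so double-check it against the compatibility table linked above), the sbt dependency for that pairing would be:
"org.apache.beam" % "beam-runners-flink_2.11" % "2.6.0"
together with a Flink cluster running a matching 1.5.x release.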
Hope it helps.
Regards, Ali
Answer 1 (score: 0)
I also had this problem and finally solved it.
If your project depends on
"org.apache.beam" % "beam-runners-flink" % beamVersion
that runner uses the old InternalTimer class.
I looked at the Scala API docs for org.apache.flink.streaming: from Flink 1.6 on, InternalTimer is an interface.
To use the InternalTimer interface correctly with the Apache Beam FlinkRunner on Flink 1.6 and later, your project must depend on
"org.apache.beam" % "beam-runners-flink-1.6" % beamVersion
or
"org.apache.beam" % "beam-runners-flink-1.7" % beamVersion
or
"org.apache.beam" % "beam-runners-flink-1.8" % beamVersion
and everything will work fine.
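Put together, a build.sbt fragment along these lines works; the beamVersion value and the extra SDK/IO artifacts here are only illustrative, and you should pick the one runner artifact that matches the Flink version your cluster actually runs:

// Illustrative sbt dependencies; beamVersion is an assumption, adjust to your release.
val beamVersion = "2.13.0"
libraryDependencies ++= Seq(
  "org.apache.beam" % "beam-sdks-java-core" % beamVersion,
  "org.apache.beam" % "beam-sdks-java-io-kafka" % beamVersion,
  // exactly one Flink runner artifact, matching the cluster's Flink version
  "org.apache.beam" % "beam-runners-flink-1.6" % beamVersion
)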