Apache Beam on Flink throws NoSuchMethodError

Date: 2018-07-24 13:51:44

Tags: apache protocol-buffers apache-flink apache-beam

I built the Beam program jar with Maven, and I want to run it on a local Flink cluster. It works fine when I run it like this:

    mvn exec:java -Dexec.mainClass=GroupbyTest -Dexec.args="--runner=FlinkRunner \
        --flinkMaster=localhost:6123 \
        --filesToStage=target/beamTest-1.0-SNAPSHOT.jar"

But when I submit it with flink run, protobuf causes a problem:

    ./bin/flink run /home/maqy/Documents/beam_samples/beamTest/target/beamTest-1.0-SNAPSHOT.jar --runner=FlinkRunner

And the log shows:

    Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/local/hadoop-2.7.5/etc/hadoop:/usr/local/hadoop-2.7.5/share/hadoop/common/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/common/*:/usr/local/hadoop-2.7.5/share/hadoop/hdfs:/usr/local/hadoop-2.7.5/share/hadoop/hdfs/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/hdfs/*:/usr/local/hadoop-2.7.5/share/hadoop/yarn/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/yarn/*:/usr/local/hadoop-2.7.5/share/hadoop/mapreduce/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/mapreduce/*:/usr/local/hadoop-2.7.5/contrib/capacity-scheduler/*.jar
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/maqy/%e4%b8%8b%e8%bd%bd/flink-1.4.0/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Cluster configuration: Standalone cluster with JobManager at localhost/127.0.0.1:6123
    Using address localhost:6123 to connect to JobManager.
    JobManager web interface address http://localhost:8081
    Starting execution of program


The program finished with the following exception:

    java.lang.NoSuchMethodError: com.google.protobuf.Descriptors$Descriptor.getOneofs()Ljava/util/List;
        at com.google.protobuf.GeneratedMessageV3$FieldAccessorTable.<init>(GeneratedMessageV3.java:1707)
        at com.google.protobuf.AnyProto.<clinit>(AnyProto.java:52)
        at org.apache.beam.model.pipeline.v1.RunnerApi.<clinit>(RunnerApi.java:53271)
        at org.apache.beam.model.pipeline.v1.RunnerApi$Components$TransformsDefaultEntryHolder.<clinit>(RunnerApi.java:448)
        at org.apache.beam.model.pipeline.v1.RunnerApi$Components$Builder.internalGetTransforms(RunnerApi.java:1339)
        at org.apache.beam.model.pipeline.v1.RunnerApi$Components$Builder.getTransformsOrDefault(RunnerApi.java:1404)
        at org.apache.beam.runners.core.construction.SdkComponents.registerPTransform(SdkComponents.java:81)
        at org.apache.beam.runners.core.construction.PipelineTranslation$1.visitPrimitiveTransform(PipelineTranslation.java:87)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:670)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:662)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:662)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:311)
        at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:245)
        at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:458)
        at org.apache.beam.runners.core.construction.PipelineTranslation.toProto(PipelineTranslation.java:59)
        at org.apache.beam.runners.core.construction.PipelineTranslation.toProto(PipelineTranslation.java:53)
        at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate(FlinkPipelineExecutionEnvironment.java:91)
        at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:110)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
        at GroupbyTest.main(GroupbyTest.java:100)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:525)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:417)
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:396)
        at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:802)
        at org.apache.flink.client.CliFrontend.run(CliFrontend.java:282)
        at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1054)
        at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1101)
        at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1098)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
        at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1098)

I would like to know how to fix this. Thanks.

2 answers:

Answer 0 (score: 0)

A NoSuchMethodError usually indicates a version problem: a class with the right package and name was found on the classpath (otherwise you would see a ClassNotFoundException), but that class is missing the expected method.

This typically happens when the classpath contains the wrong version of a dependency, in this case Google Protobuf. In the log above, the Flink lib directory and the expanded Hadoop classpath can both put an older protobuf-java ahead of the version Beam was compiled against.
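One way to confirm the conflict is `mvn dependency:tree -Dincludes=com.google.protobuf`, which shows every protobuf version your build pulls in. A common workaround (not part of the original answer) is to shade and relocate protobuf inside the job jar so that whatever version sits on the Flink/Hadoop classpath no longer matters. Below is a sketch of a maven-shade-plugin configuration; the plugin version and the `shaded.` relocation prefix are illustrative assumptions, not values from the question:

```xml
<!-- pom.xml, under <build><plugins>. Relocates protobuf classes into the
     job jar under a private package name so the cluster's copy is ignored.
     Plugin version and shadedPattern prefix are illustrative assumptions. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.1.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>shaded.com.google.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

After `mvn package`, submit the shaded jar with `./bin/flink run` as before; Beam's generated protobuf code then resolves against the relocated copy bundled in the jar.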

Answer 1 (score: 0)

I solved this problem by upgrading to Beam 2.6.
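For reference, upgrading usually means bumping the Beam coordinates in the pom. A sketch of what that might look like; the Scala-suffixed Flink runner artifact id matches Beam's convention in that release line, but check it against your project before copying:

```xml
<!-- pom.xml: pin the Beam SDK and Flink runner to the same 2.6.0 release.
     Artifact ids follow Beam's standard naming; verify against your build. -->
<properties>
  <beam.version>2.6.0</beam.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-core</artifactId>
    <version>${beam.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-runners-flink_2.11</artifactId>
    <version>${beam.version}</version>
  </dependency>
</dependencies>
```

Keeping both artifacts on one `beam.version` property avoids mixing SDK and runner releases, which is another common source of NoSuchMethodError.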