java.lang.IllegalArgumentException: Expect srcResourceIds and destResourceIds have the same scheme, but received hdfs

Time: 2018-06-28 12:55:32

Tags: hdfs beam

I am trying to read from Kafka and write to HDFS using Apache Beam with the Spark runner. When I run it locally it succeeds without any exception, but whenever I use --output=hdfs:///myhdfslocation it fails with the exception below.

    Exception in thread "main" org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.IllegalArgumentException: Expect srcResourceIds and destResourceIds have the same scheme, but received hdfs, counts-2018-06-28T12.
            at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:66)
            at org.apache.beam.runners.spark.SparkPipelineResult.access$000(SparkPipelineResult.java:41)
            at org.apache.beam.runners.spark.SparkPipelineResult$StreamingMode.stop(SparkPipelineResult.java:163)
            at org.apache.beam.runners.spark.SparkPipelineResult.offerNewState(SparkPipelineResult.java:198)
            at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:101)
            at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:87)
            at org.apache.beam.examples.KafkaToHdfs.main(KafkaToHdfs.java:71)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:782)
            at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
            at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.lang.IllegalArgumentException: Expect srcResourceIds and destResourceIds have the same scheme, but received hdfs, counts-2018-06-28T12.
            at org.apache.beam.sdk.repackaged.com.google.common.base.Preconditions.checkArgument(Preconditions.java:122)
            at org.apache.beam.sdk.io.FileSystems.validateSrcDestLists(FileSystems.java:436)
            at org.apache.beam.sdk.io.FileSystems.copy(FileSystems.java:281)
            at org.apache.beam.sdk.io.FileBasedSink$WriteOperation.moveToOutputFiles(FileBasedSink.java:755)
            at org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn.process(WriteFiles.java:798)
    18/06/28 05:44:35 INFO ShutdownHookManager: Shutdown hook called
    18/06/28 05:44:35 INFO ShutdownHookManager: Deleting directory /tmp/spark-2d51df24-6661-4c07-8181-21629882fe7d
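For context, the precondition that fails in FileSystems.validateSrcDestLists compares the URI scheme of each source path (the temp files, which are on hdfs) against each destination path. Below is a minimal, illustrative sketch of that kind of check using only java.net.URI; it is not Beam's actual code, and the class and method names (SchemeCheck, schemeOf, validateSameScheme) are made up for this example. It shows how a scheme-less destination (such as a bare output prefix like counts-00000) produces exactly this kind of mismatch against an hdfs source:

```java
import java.net.URI;

public class SchemeCheck {
    // Returns the URI scheme of a path, or null for scheme-less (relative) paths.
    static String schemeOf(String path) {
        return URI.create(path).getScheme();
    }

    // Illustrative version of the failing precondition: src and dest must share a scheme.
    static void validateSameScheme(String src, String dest) {
        String s = schemeOf(src);
        String d = schemeOf(dest);
        boolean same = (s == null) ? (d == null) : s.equals(d);
        if (!same) {
            throw new IllegalArgumentException(
                "Expect srcResourceIds and destResourceIds have the same scheme, but received "
                    + s + ", " + d);
        }
    }

    public static void main(String[] args) {
        // Both on HDFS: schemes match, no exception.
        validateSameScheme("hdfs://nn/tmp/beam-temp/part-0", "hdfs://nn/out/counts-00000");

        // A scheme-less destination reproduces the mismatch seen in the stack trace.
        try {
            validateSameScheme("hdfs://nn/tmp/beam-temp/part-0", "counts-00000");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```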

I found a similar question: Apache Beam: 'Expect srcResourceIds and destResourceIds have the same scheme, but received hdfs,filename'

but somehow it doesn't solve my problem. If anyone has run into the same issue and solved it, please help.

0 answers