Could not complete the operation. Exception is not retryable

Asked: 2018-08-13 09:19:46

Tags: hadoop apache-flink

Below is my code:

public static void main(String[] args) throws Exception {
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

    // conf was not declared in the original snippet; a plain Hadoop Configuration is assumed here
    Configuration conf = new Configuration();
    conf.set("mapreduce.input.fileinputformat.inputdir", "hdfs://master.demo.com:8020/test/longs.txt");
    Job job = Job.getInstance(conf);

    // Wrap the Hadoop (mapreduce API) input format so Flink can use it as a DataSource
    org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat<LongWritable, Text> hadoopIF = HadoopInputs.createHadoopInput(
            new CustomTextInputFormat(),
            LongWritable.class,
            Text.class,
            job
    );

    // Parse each line into an Integer and pull the results back to the client
    env.createInput(hadoopIF).map(new MapFunction<Tuple2<LongWritable, Text>, Integer>() {
        @Override
        public Integer map(Tuple2<LongWritable, Text> value) throws Exception {
            return Integer.parseInt(value.f1.toString());
        }
    }).collect();
}
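The CustomTextInputFormat class referenced above is not included in the question. As a rough stand-in for readers who want to reproduce the setup, a minimal version that simply behaves like Hadoop's stock new-API TextInputFormat might look like this (an assumption for illustration, not the asker's actual class):

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.LineRecordReader;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

// Hypothetical stand-in for the CustomTextInputFormat used in the question.
// It delegates to the standard line reader, so keys are byte offsets
// (LongWritable) and values are the text lines (Text), matching the type
// parameters of the HadoopInputFormat above.
public class CustomTextInputFormat extends TextInputFormat {

    @Override
    public RecordReader<LongWritable, Text> createRecordReader(InputSplit split, TaskAttemptContext context) {
        return new LineRecordReader();
    }
}

As a side note, if CustomTextInputFormat extends the mapreduce FileInputFormat, the input path could alternatively be supplied through HadoopInputs.readHadoopFile(new CustomTextInputFormat(), LongWritable.class, Text.class, "hdfs://master.demo.com:8020/test/longs.txt", job), which adds the path to the Job instead of setting mapreduce.input.fileinputformat.inputdir on the Configuration; both calls return the same org.apache.flink.api.java.hadoop.mapreduce.HadoopInputFormat wrapper.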

I run the following command:

flink run -c com.dounine.flink.Application4 ./analysis-flink-dataset-hbase-user-1.0-SNAPSHOT.jar
  

Setting HADOOP_CONF_DIR=/etc/hadoop/conf because no HADOOP_CONF_DIR was set.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/bigdata/flink-1.6.0/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/bigdata/flink-1.6.0/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Starting execution of program
2018-08-13 17:03:56,456 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.datanode.data.dir; Ignoring.
2018-08-13 17:03:56,457 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.datanode.failed.volumes.tolerated; Ignoring.
2018-08-13 17:03:56,457 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.namenode.http-address; Ignoring.
2018-08-13 17:03:56,457 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.namenode.name.dir; Ignoring.
2018-08-13 17:03:56,457 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.webhdfs.enabled; Ignoring.
2018-08-13 17:03:56,461 WARN  org.apache.hadoop.conf.Configuration - core-site.xml: an attempt to override final parameter: fs.defaultFS; Ignoring.
2018-08-13 17:03:56,553 INFO  org.apache.hadoop.security.UserGroupInformation - Login successful for user hbase/lake.dounine.com@dounine.com using keytab file /etc/security/keytabs/lake.keytab
2018-08-13 17:03:56,575 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.datanode.data.dir; Ignoring.
2018-08-13 17:03:56,575 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.datanode.failed.volumes.tolerated; Ignoring.
2018-08-13 17:03:56,575 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.namenode.http-address; Ignoring.
2018-08-13 17:03:56,575 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.namenode.name.dir; Ignoring.
2018-08-13 17:03:56,620 WARN  org.apache.hadoop.conf.Configuration - hdfs-site.xml: an attempt to override final parameter: dfs.webhdfs.enabled; Ignoring.
2018-08-13 17:03:56,622 WARN  org.apache.hadoop.conf.Configuration - core-site.xml: an attempt to override final parameter: fs.defaultFS; Ignoring.
2018-08-13 17:03:56,707 INFO  org.apache.hadoop.hbase.zookeeper.ReadOnlyZKClient - Connect 0x55342f40 to storm4.starsriver.cn:2181,storm2.starsriver.cn:2181,storm3.starsriver.cn:2181 with session timeout=90000ms, retries 6, retry interval 1000ms, keepAlive=60000ms

2018-08-13 17:03:57,384 WARN  org.apache.hadoop.hdfs.BlockReaderLocal - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.

The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result. (JobID: 6018b7c702305a63967096a341fdb239)
    at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:260)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:486)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:474)
    at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
    at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:816)
    at org.apache.flink.api.java.DataSet.collect(DataSet.java:413)
    at com.dounine.flink.Application4.main(Application4.java:156)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:426)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:804)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:280)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:215)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1044)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1120)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1120)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
    at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$8(RestClusterClient.java:379)
    at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
    at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
    at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
    at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$5(FutureUtils.java:213)
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
    at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:561)
    at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:929)
    at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Exception is not retryable.
    at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
    at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
    at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:911)
    at java.util.concurrent.CompletableFuture$UniRelay.tryFire(CompletableFuture.java:899)
    ... 12 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Exception is not retryable.
    ... 10 more
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: [Job submission failed.]
    at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
    at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
    at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:911)
    at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:953)
    at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:926)
    ... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Job submission failed.]
    at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:310)
    at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:294)
    at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:952)
    ... 5 more

0 Answers
