线程" main"中的例外情况org.apache.hadoop.mapred.InvalidInputException:输入路径不存在:

Date: 2018-03-06 06:48:10

Tags: hadoop mapreduce hadoop2

I have created an input directory and put a sample file into it. I have also created an output directory. However, when the MapReduce program runs I get the error below. These are the commands I used to run the MapReduce job:

bin/hdfs dfs -mkdir /input
bin/hdfs dfs -put /home/biswajit/sample.txt /input/
bin/hadoop jar /usr/local/hadoop/hadoop-2.9.0/share/hadoop/mapreduce/units.jar com.hadoop.ProcessUnits /input/sample.txt /output
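
To double-check that sample.txt actually landed under /input on HDFS, the directory can be listed with the standard command (nothing here beyond the paths already used above):

bin/hdfs dfs -ls /input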

The error is:

Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: **hdfs://localhost:54310/home/biswajit/input/sample.txt**
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:294)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:236)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:322)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:341)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:333)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:202)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1886)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:576)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:571)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1886)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:571)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:871)
    at com.hadoop.ProcessUnits.main(ProcessUnits.java:96)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:239)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:153)

1 Answer:

Answer 0 (score: 0):

$HADOOP_HOME/input does not exist on HDFS.

$HADOOP_HOME is a bash variable on your local filesystem.

You only created a directory for /input, so you can either mkdir the full path that the variable expands to, if you want that command to run as-is, or remove the variable from the command that runs the JAR file.
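
A rough sketch of those two options, using the literal paths from the question and from the error message rather than guessing what the variable expands to:

# Option 1: create, on HDFS, the path the job is actually resolving
bin/hdfs dfs -mkdir -p /home/biswajit/input
bin/hdfs dfs -put /home/biswajit/sample.txt /home/biswajit/input/

# Option 2: pass the HDFS path that was already created
bin/hadoop jar /usr/local/hadoop/hadoop-2.9.0/share/hadoop/mapreduce/units.jar com.hadoop.ProcessUnits /input/sample.txt /output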

As long as hdfs dfs -ls /input/* shows some files, that command looks fine, but I am not sure what that Java class actually expects as input.

Note:

There is a difference between

hdfs://localhost:54310/home/biswajit/input

and

hdfs://localhost:54310/input

More specifically, HDFS has no /home folder, so it looks like you are either not running a pseudo-distributed cluster, or you created that directory yourself.
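
One way to check which of those it is, is to list the HDFS root and the path from the error message (the hdfs://localhost:54310 prefix comes from the cluster's default filesystem setting):

bin/hdfs dfs -ls /
bin/hdfs dfs -ls /home/biswajit/input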