Hadoop MapReduce program error

Posted: 2015-04-10 07:49:40

Tags: hadoop mapreduce bigdata hadoop2

I am getting the following error from my first wordcount MapReduce program:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    cat: `/home/Kumar/DEV/Eclipse/eclipse/Workspace/MyFirstMapReduce/Files/input/': No such file or directory

OS & Hadoop version

CentOS
Release 6.6 (Final)
Kernel Linux 2.6.32-504.12.2.el6.x86_64
GNOME 2.28.2

Hadoop 2.6.0 64bit version

Bashrc configuration

export JAVA_HOME="/usr/lib/jvm/jdk1.8.0"
export PATH=$PATH:$JAVA_HOME/bin
export HADOOP_CLASSPATH=$JAVA_HOME/lib/tools.jar

export HADOOP_INSTALL="/home/Kumar/DEV/HDS/hadoop"
export PATH=$PATH:$HADOOP_INSTALL/bin  
export PATH=$PATH:$HADOOP_INSTALL/sbin  
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL  
export HADOOP_COMMON_HOME=$HADOOP_INSTALL  
export HADOOP_HDFS_HOME=$HADOOP_INSTALL  
export YARN_HOME=$HADOOP_INSTALL  

Running daemons

3763 SecondaryNameNode
4406 ResourceManager
22264 org.eclipse.equinox.launcher_1.3.0.v20140415-2008.jar
17736 Jps
30584 org.eclipse.equinox.launcher_1.3.0.v20140415-2008.jar
4697 NodeManager
3611 DataNode
4059 NameNode

I copied the WordCount program from the Apache website and followed the steps given in this tutorial. When I compile WordCount.java, it creates 3 class files:

hadoop com.sun.tools.javac.Main /home/Kumar/DEV/Eclipse/eclipse/Workspace/MyFirstMapReduce/src/WordCount.java 

WordCount.class
WordCount$IntSumReducer.class
WordCount$TokenizerMapper.class
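
If compilation succeeds, the next steps in the Apache tutorial package the class files into a jar and submit the job. A minimal sketch (wc.jar and the input/output paths are the tutorial's example names, not paths from this question; the block skips packaging gracefully when no class files are present):

```shell
# Package the compiled class files into a jar (wc.jar is an example name);
# do nothing if no WordCount class files exist in the current directory:
if ls WordCount*.class >/dev/null 2>&1; then
  jar cf wc.jar WordCount*.class
  packaged="yes"
else
  packaged="no class files here"
fi
echo "packaged: $packaged"

# Submit the job; input/ and output/ are HDFS paths (example names),
# shown as a comment because it needs a running cluster:
#   hadoop jar wc.jar WordCount input output
```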

When I run the HDFS command, it throws the warning message and a file-or-directory-not-found message, even though the file & directory exist:

hdfs dfs -cat /home/Kumar/DEV/Eclipse/eclipse/Workspace/MyFirstMapReduce/Files/input/file1

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
cat: `/home/Kumar/DEV/Eclipse/eclipse/Workspace/MyFirstMapReduce/Files/input/file1': No such file or directory


hdfs dfs -cat /home/Kumar/DEV/Eclipse/eclipse/Workspace/MyFirstMapReduce/Files/input/

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
cat: `/home/Kumar/DEV/Eclipse/eclipse/Workspace/MyFirstMapReduce/Files/input/': No such file or directory

2 Answers:

Answer 0 (score: 0)

The first message is just a warning and is nothing to worry about.

The second message occurs because hadoop fs looks in HDFS, not on the local file system. You may want to check whether the file exists with ls.
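
The distinction can be checked quickly (a sketch using the path from the question; the hdfs command is shown as a comment since it needs a running cluster):

```shell
# Check the path from the question on the LOCAL filesystem:
input=/home/Kumar/DEV/Eclipse/eclipse/Workspace/MyFirstMapReduce/Files/input/file1
if [ -e "$input" ]; then
  location="present on local filesystem"
else
  location="absent on local filesystem"
fi
echo "$input -> $location"

# hdfs dfs resolves the same string against HDFS, not the local disk,
# which is why the question's cat reports "No such file or directory":
#   hdfs dfs -ls "$input"
```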

Answer 1 (score: 0)

  1. The file you are trying to use is on the local Linux file system, not on HDFS. So you can use a simple command like: cat /home/Kumar/DEV/Eclipse/eclipse/Workspace/MyFirstMapReduce/Files/input/file1
  2. If your MapReduce program needs the input file on HDFS, then put the same file on HDFS and use the HDFS path in the job. You can put a file on HDFS with this command: hadoop fs -put <input-file-path-on-linux> <path-on-hdfs>

    The tutorial you are referring to uses the old MapReduce API. You can find the same tutorial for the current Hadoop version here: http://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html

    Hope this helps.
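
The upload step from this answer can be sketched as follows (the HDFS target /user/Kumar/wordcount/input is a hypothetical example path; the block falls back to a message when hadoop is not on PATH, so it only illustrates the commands):

```shell
SRC=/home/Kumar/DEV/Eclipse/eclipse/Workspace/MyFirstMapReduce/Files/input
DST=/user/Kumar/wordcount/input   # hypothetical HDFS target directory

if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -mkdir -p "$DST"          # create the target directory in HDFS
  hadoop fs -put "$SRC"/file1 "$DST"/ # copy the local file into HDFS
  hadoop fs -ls "$DST"                # verify the upload
  result="uploaded"
else
  result="hadoop not on PATH; commands shown for illustration"
fi
echo "$result"
```

After the upload, the MapReduce job should be given the HDFS path ($DST) as its input, not the local Linux path.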