Missing Hadoop 2.2.0 jar files

Date: 2013-12-30 03:45:01

Tags: hadoop

I installed hadoop-2.2.0 and am trying to run the MapReduce example code bundled with it. However, it fails every time with a ClassNotFoundException, and as far as I can tell the cause is the class names set in the hadoop.sh file. Below is the relevant section of that script; none of these classes are bundled with the installation as class files, although I do see them in the source.

if [ "$COMMAND" = "fs" ] ; then
  CLASS=org.apache.hadoop.fs.FsShell
elif [ "$COMMAND" = "version" ] ; then
  CLASS=org.apache.hadoop.util.VersionInfo
elif [ "$COMMAND" = "jar" ] ; then
  CLASS=org.apache.hadoop.util.RunJar
elif [ "$COMMAND" = "checknative" ] ; then
  CLASS=org.apache.hadoop.util.NativeLibraryChecker
elif [ "$COMMAND" = "distcp" ] ; then
  CLASS=org.apache.hadoop.tools.DistCp
  CLASSPATH=${CLASSPATH}:${TOOL_PATH}
elif [ "$COMMAND" = "daemonlog" ] ; then
  CLASS=org.apache.hadoop.log.LogLevel
elif [ "$COMMAND" = "archive" ] ; then
  CLASS=org.apache.hadoop.tools.HadoopArchives
  CLASSPATH=${CLASSPATH}:${TOOL_PATH}
elif [[ "$COMMAND" = -*  ]] ; then
    # class and package names cannot begin with a -
    echo "Error: No command named \`$COMMAND' was found. Perhaps you meant \`hadoop ${COMMAND#-}'"
    exit 1
else
  CLASS=$COMMAND

Here is the error:

Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.hadoop.util.RunJar
   at gnu.java.lang.MainThread.run(libgcj.so.13)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.RunJar not found in gnu.gcj.runtime.SystemClassLoader{urls=[file:/usr/local/hadoop-2.2.0/etc/hadoop/,file:/usr/local/hadoop-2.2.0/share/hadoop/hdfs/], parent=gnu.gcj.runtime.ExtensionClassLoader{urls=[], parent=null}}
   at java.net.URLClassLoader.findClass(libgcj.so.13)
   at gnu.gcj.runtime.SystemClassLoader.findClass(libgcj.so.13)
   at java.lang.ClassLoader.loadClass(libgcj.so.13)
   at java.lang.ClassLoader.loadClass(libgcj.so.13)
   at gnu.java.lang.MainThread.run(libgcj.so.13)

1 Answer:

Answer 0 (score: 1)

I finally figured this out. The YARN daemons (for MapReduce) and the DFS daemons need to be running in the background before any Hadoop job can run. Being a Hadoop n00b, I had missed that. To start both sets of processes, type start-yarn.sh and start-dfs.sh in a command window. Each of them launches two console windows and spits out copious diagnostic messages.
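For reference, the start-up sequence the answer describes can be sketched as follows. The `HADOOP_HOME` path matches the `/usr/local/hadoop-2.2.0` location shown in the error trace, and the examples-jar name is an assumption based on the default 2.2.0 tarball layout:

```shell
# Minimal sketch, assuming a default tarball install; adjust
# HADOOP_HOME to match your own layout.
export HADOOP_HOME=/usr/local/hadoop-2.2.0

# Start the HDFS daemons (NameNode, DataNode, SecondaryNameNode).
"$HADOOP_HOME/sbin/start-dfs.sh"

# Start the YARN daemons (ResourceManager, NodeManager).
"$HADOOP_HOME/sbin/start-yarn.sh"

# Verify the daemons are up: jps should list NameNode, DataNode,
# ResourceManager, and NodeManager among the running JVMs.
jps

# With the daemons running, the bundled example job should now work
# (jar name assumed from the stock 2.2.0 distribution):
"$HADOOP_HOME/bin/hadoop" jar \
  "$HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar" pi 2 5
```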
