Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool

Date: 2014-04-04 13:22:59

Tags: hadoop mapreduce

I get the error below when I package (jar) and run my DefaultHadoopJob.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.Tool
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 12 more
Could not find the main class: DefaultHadoopJobDriver. Program will exit.


Commands used to build the jar:

# jar -cvf dhj.jar 
# hadoop -jar dhj.jar DefaultHadoopJobDriver

The above command gave me the error "Failed to load Main-Class manifest attribute from dhj.jar".
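For context, that error means the jar's META-INF/MANIFEST.MF carries no Main-Class attribute, so the JVM has no entry point to launch. A minimal manifest for this job (class name taken from the question) would contain:

```
Main-Class: DefaultHadoopJobDriver
```

The `jar` tool's `-e` flag writes this attribute for you, which is what the rebuilt command below does.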

I rebuilt the jar with a manifest using the command below:

jar -cvfe dhj.jar DefaultHadoopJobDriver .
hadoop -jar dhj.jar DefaultHadoopJobDriver

This returned the original error message I reported above.

My Hadoop job has a single class, "DefaultHadoopJobDriver", which extends Configured and implements Tool; its run method contains only the code for Job creation and setting the input and output paths. Also, I'm using the new API.
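For reference, a minimal sketch of what such a driver typically looks like with the new API (the class name is taken from the question; the mapper/reducer wiring is omitted since the question doesn't show it, and hadoop-core must be on the compile classpath):

```java
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class DefaultHadoopJobDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // Create the job and set input/output paths, as described above.
        Job job = new Job(getConf(), "DefaultHadoopJob");
        job.setJarByClass(DefaultHadoopJobDriver.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new DefaultHadoopJobDriver(), args));
    }
}
```

Note that the Tool and Configured types referenced here live in hadoop-core-1.2.1.jar, which is exactly the jar the NoClassDefFoundError says is missing at runtime.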

I'm running hadoop 1.2.1 and the Job works fine from eclipse.

This might have something to do with the classpath. Please help.

3 Answers:

Answer 0 (score: 5)

To execute the jar, you should not pass hadoop -jar. The command looks like this:

 hadoop jar <jar> [mainClass] args...

If the jar still throws a java.lang.ClassNotFoundException, you can use the

hadoop classpath

command to check whether hadoop-core-1.2.1.jar is present on your Hadoop installation's classpath.

FYI, if it is not in that list, you must add the jar to the Hadoop lib directory.
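As a quick illustration of what that check does, the sketch below scans a colon-separated classpath string of the shape `hadoop classpath` prints (the paths here are made up for the example) and reports any hadoop-core entry:

```java
public class ClasspathScan {
    public static void main(String[] args) {
        // Simulated `hadoop classpath` output; these paths are assumptions
        // for illustration. In practice, pipe the real command's output in.
        String cp = "/usr/lib/hadoop/conf:/usr/lib/hadoop/hadoop-core-1.2.1.jar:"
                  + "/usr/lib/hadoop/lib/commons-logging-1.1.1.jar";
        // Classpath entries are separated by ':' on Unix-like systems.
        for (String entry : cp.split(":")) {
            if (entry.contains("hadoop-core")) {
                System.out.println(entry);
            }
        }
    }
}
```

If nothing prints for your real classpath, hadoop-core is missing and must be added to the lib directory as described above.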

Answer 1 (score: 1)

Try building your Hadoop Java code against all the Hadoop jars in Hadoop's lib folder. In this case you are missing the Hadoop util classes, which live in hadoop-core-*.jar.

The classpath can be specified when building the code into the jar, or it can be supplied externally with the following command:

    hadoop -cp <path_containing_hadoop_jars> -jar <jar_name>

Answer 2 (score: 0)

If anyone using Maven lands here: dependency issues can be resolved by asking Maven to include any jars it requires inside the parent project's jar itself. That way, Hadoop doesn't have to look elsewhere for dependencies; it finds them right there. Here's how to do it:

  1. Go to pom.xml.
  2. Add a <build> section to your <project> tag.
  3. Add the following to your <build></build> section:

    <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>1.7.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <exclude>org.slf4j:slf4j-api</exclude>
                                <exclude>junit:junit</exclude>
                                <exclude>jmock:jmock</exclude>
                                <exclude>xml-apis:xml-apis</exclude>
                                <exclude>org.testng:testng</exclude>
                                <exclude>org.mortbay.jetty:jetty</exclude>
                                <exclude>org.mortbay.jetty:jetty-util</exclude>
                                <exclude>org.mortbay.jetty:servlet-api-2.5</exclude>
                                <exclude>tomcat:jasper-runtime</exclude>
                                <exclude>tomcat:jasper-compiler</exclude>
                                <exclude>org.apache.hadoop:hadoop-core</exclude>
                                <exclude>org.apache.mahout:mahout-math</exclude>
                                <exclude>commons-logging:commons-logging</exclude>
                                <exclude>org.mortbay.jetty:jsp-api-2.1</exclude>
                                <exclude>org.mortbay.jetty:jsp-2.1</exclude>
                                <exclude>org.eclipse.jdt:core</exclude>
                                <exclude>ant:ant</exclude>
                                <exclude>org.apache.hadoop:avro</exclude>
                                <exclude>jline:jline</exclude>
                                <exclude>log4j:log4j</exclude>
                                <exclude>org.yaml:snakeyaml</exclude>
                                <exclude>javax.ws.rs:jsr311-api</exclude>
                                <exclude>org.slf4j:jcl-over-slf4j</exclude>
                                <exclude>javax.servlet:servlet-api</exclude>
                            </excludes>
                        </artifactSet>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/jruby.home</exclude>
                                    <exclude>META-INF/license</exclude>
                                    <exclude>META-INF/maven</exclude>
                                    <exclude>META-INF/services</exclude>
                                </excludes>
                            </filter>
                        </filters>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    
  4. Now build the project again and run it with the normal hadoop jar my.jar command. It shouldn't complain about dependencies anymore. Hope this helps!
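After building, you can confirm the Hadoop classes actually made it into the shaded jar, e.g. with `jar tf my.jar`. The sketch below performs the same check programmatically against a tiny in-memory zip standing in for the real jar (the entry name is just an example):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class JarCheck {
    public static void main(String[] args) throws IOException {
        // Build a tiny in-memory zip as a stand-in for the real shaded jar.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(buf)) {
            zos.putNextEntry(new ZipEntry("org/apache/hadoop/util/Tool.class"));
            zos.closeEntry();
        }
        // List entries, printing the ones that belong to Hadoop.
        try (ZipInputStream zis =
                 new ZipInputStream(new ByteArrayInputStream(buf.toByteArray()))) {
            for (ZipEntry e; (e = zis.getNextEntry()) != null; ) {
                if (e.getName().contains("hadoop")) {
                    System.out.println(e.getName());
                }
            }
        }
    }
}
```

A jar file is just a zip with a manifest, which is why java.util.zip can read it; against your real shaded jar you would open it with a FileInputStream instead of the in-memory buffer.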