Getting a ClassNotFound exception while dumping configuration in Hadoop

Time: 2013-05-14 18:56:27

Tags: hadoop

I am trying out a simple program from the book "Hadoop in Action" that merges a set of files from the local file system into one file in HDFS. The code snippet is the same as the one provided in the book.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.Path;

public class PutMerge {

    public static void main(String[] args) throws IOException{
        Configuration conf = new Configuration();
        FileSystem hdfs = FileSystem.get(conf);
        FileSystem local = FileSystem.getLocal(conf);

        Path inputDir = new Path(args[0]); // First argument has the input directory 
        Path hdfsFile = new Path(args[1]); // Concatenated hdfs file name

        try {
            FileStatus[] inputFiles = local.listStatus(inputDir); // list of Local Files

            FSDataOutputStream out = hdfs.create(hdfsFile); // target file creation

            for (int i = 0; i < inputFiles.length; i++) {

                FSDataInputStream in = local.open(inputFiles[i].getPath());

                int bytesRead = 0;
                byte[] buff = new byte[256];

                // copy the current local file into the HDFS output stream
                while ((bytesRead = in.read(buff)) > 0) {
                    out.write(buff,0,bytesRead);
                }
                in.close();
            }
            out.close();

        } 
        catch(Exception e) {
            e.printStackTrace();
        }

    }
}

The program compiles successfully, but when I try to run it I get the following exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.(DefaultMetricsSystem.java:37)
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.(DefaultMetricsSystem.java:34)
        at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:217)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
        at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
        at org.apache.hadoop.security.KerberosName.(KerberosName.java:79)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:210)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
        at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:468)
        at org.apache.hadoop.fs.FileSystem$Cache$Key.(FileSystem.java:1519)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1420)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
        at PutMerge.main(PutMerge.java:16)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
        ... 17 more

Based on input from some posts, I added the commons packages. My classpath definition is:

/usr/java/jdk1.7.0_21:/data/commons-logging-1.1.2/commons-logging-1.1.2.jar:/data/hadoop-1.1.2/hadoop-core-1.1.2.jar:/data/commons-logging-1.1.2/commons-logging-adapters-1.1.2.jar:/data/commons-logging-1.1.2/commons-logging-api-1.1.2.jar:.

Any clue as to why this is not working?

1 Answer:

Answer 0 (score: 1):

You have not included Apache Commons Configuration in your classpath.
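
If you do want to launch the program with a plain `java` command, the dependency jars bundled under the Hadoop distribution's lib directory (which, in a typical Hadoop 1.x tarball, include a commons-configuration jar) would also need to be on the classpath. A rough sketch, assuming the /data/hadoop-1.1.2 install from the question and a standard lib/ subdirectory; the input and output paths are only illustrative:

# Assumes /data/hadoop-1.1.2/lib holds the bundled dependency jars
# (commons-configuration among them); adjust the paths to your install.
export CLASSPATH="/data/hadoop-1.1.2/hadoop-core-1.1.2.jar:/data/hadoop-1.1.2/lib/*:."
java PutMerge /data/local/input /user/hadoop/merged.txt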

Really, you shouldn't need to include much besides Hadoop itself. Make sure you run your jar with hadoop itself:

> hadoop jar myJar.jar
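
For reference, a minimal end-to-end sketch of compiling and launching the class through the hadoop wrapper. The jar name and HDFS paths here are made up, and the `hadoop classpath` subcommand is assumed to be available in your install; if it is not, point -classpath at hadoop-core-1.1.2.jar and the lib directory directly:

# Compile against whatever the hadoop launcher itself puts on the classpath,
# package the class, and run it; file names and paths are illustrative.
javac -classpath "$(hadoop classpath)" PutMerge.java
jar cf putmerge.jar PutMerge.class
hadoop jar putmerge.jar PutMerge /data/local/input /user/hadoop/merged.txt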