Hadoop NoClassDefFoundError when adding external Jars

Date: 2016-01-31 02:30:06

Tags: java hadoop jar noclassdeffounderror distributed-cache

I am trying to run a MapReduce job on Hadoop that uses an external jar. I added the jar to HDFS with copyFromLocal.

In my main method I add the jar to the DistributedCache. However, when I run the MapReduce program, I get a NoClassDefFoundError in my Mapper class. I have tried the solutions from many similar questions, but I have not been able to resolve the problem. Any guidance is appreciated.

The jars are stored in HDFS at:

/user/hduser/lib/
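The upload step described above might look like the following (a sketch; the jar filenames are taken from the question, and the exact local paths are assumptions):

```shell
# Copy the Joda-Time jars from the local filesystem into the HDFS lib directory.
hdfs dfs -mkdir -p /user/hduser/lib/
hdfs dfs -copyFromLocal joda-time-2.9.1.jar /user/hduser/lib/
hdfs dfs -copyFromLocal joda-time-2.9.1-no-tzdb.jar /user/hduser/lib/
```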

Adding the jars to the distributed cache, in the main method:

Configuration conf = new Configuration();
String jarToAdd1 = "/user/hduser/lib/joda-time-2.9.1-no-tzdb.jar";
String jarToAdd2 = "/user/hduser/lib/joda-time-2.9.1.jar";
addJarToDistributedCache(jarToAdd1, conf);
addJarToDistributedCache(jarToAdd2, conf);
.........

Helper method that adds a jar to the task classpath via the distributed cache:

private static void addJarToDistributedCache(String jarToAdd, Configuration conf) throws IOException {
    Path hdfsJar = new Path(jarToAdd);
    // Note: org.apache.hadoop.filecache.DistributedCache is deprecated in Hadoop 2.x;
    // Job#addFileToClassPath(Path) is the current equivalent.
    DistributedCache.addFileToClassPath(hdfsJar, conf);
}
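A common alternative to wiring the cache by hand is to ship the jars with the generic `-libjars` option, which only works when the driver is run through `ToolRunner`/`GenericOptionsParser`. A sketch under assumed names (`myjob.jar`, `com.example.MyDriver`, and the input/output paths are placeholders, not from the question):

```shell
# Ship third-party jars with the job; Hadoop copies them to the distributed
# cache and adds them to each task's classpath automatically.
hadoop jar myjob.jar com.example.MyDriver \
    -libjars joda-time-2.9.1.jar,joda-time-2.9.1-no-tzdb.jar \
    /input /output
```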

Mapper where the error occurs:

public static class Map1 extends Mapper<LongWritable, Text, IntWritable, UserData> {

    Map<IntWritable, UserData> userLog = new HashMap<IntWritable, UserData>();

    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {

        String line = value.toString();
        StringTokenizer tokenizer = new StringTokenizer(line);
        DateTimeFormatter formatter = DateTimeFormat.forPattern("yyyyddmm HH:mm:ss");    // *********ERROR HAPPENS HERE **********
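Separately from the classpath problem, the pattern `"yyyyddmm HH:mm:ss"` is suspect: in both Joda-Time and the JDK's `java.time`, lowercase `mm` means minute-of-hour, while month-of-year is `MM`. A minimal, runnable sketch using `java.time` (which shares these pattern letters with Joda-Time, and avoids the external jar entirely on Java 8+); the sample timestamp is illustrative:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class PatternCheck {
    public static void main(String[] args) {
        // "MM" is month-of-year; lowercase "mm" would be minute-of-hour.
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyyMMdd HH:mm:ss");
        LocalDateTime t = LocalDateTime.parse("20160131 02:30:06", formatter);
        System.out.println(t); // prints 2016-01-31T02:30:06
    }
}
```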

0 Answers:

No answers yet.