No job jar file set. User classes may not be found

Date: 2014-11-06 11:09:23

Tags: hadoop

I have gone through many blog posts on this problem, but with no luck. I am not sure what mistake I am making here. Can someone help me with this?

My program is:

package hadoopbook;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;


public class WordCount {

    // Mapper: splits each comma-separated record into words and emits (word, 1)
    public static class WcMapperDemo extends Mapper<LongWritable, Text, Text, IntWritable> {

        private final Text mapKey = new Text();
        private final IntWritable one = new IntWritable(1);

        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            String record = value.toString();
            String[] words = record.split(",");

            for (String word : words) {
                mapKey.set(word);
                context.write(mapKey, one);
            }
        }
    }

    // Reducer: sums the counts emitted for each word
    public static class WcReducerDemo extends Reducer<Text, IntWritable, Text, IntWritable> {

        private final IntWritable redValue = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            redValue.set(sum);
            context.write(key, redValue);
        }
    }

    // Driver
    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "Word Count Job");

        job.setJarByClass(WordCount.class);
        job.setMapperClass(WcMapperDemo.class);
        job.setReducerClass(WcReducerDemo.class);

        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

The error I get comes from executing this command:

[cloudera@localhost Desktop]$ sudo -u hdfs hadoop jar WordCount.jar hadoopbook.WordCount /user/cloudera/InputFiles/Small/Words.txt /user/cloudera/Output

The WordCount.jar file is in my Desktop folder!

[cloudera@localhost Desktop]$ sudo -u hdfs hadoop jar WordCount.jar hadoopbook.WordCount /user/cloudera/InputFiles/Small/Words.txt /user/cloudera/Output
14/11/06 02:56:15 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/11/06 02:56:15 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/11/06 02:56:15 INFO input.FileInputFormat: Total input paths to process : 1
14/11/06 02:56:16 INFO mapred.JobClient: Running job: job_201411040035_0017
14/11/06 02:56:17 INFO mapred.JobClient:  map 0% reduce 0%
14/11/06 02:56:29 INFO mapred.JobClient: Task Id : attempt_201411040035_0017_m_000000_0, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1617)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:191)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:631)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1523)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1615)
    ... 8 more

14/11/06 02:56:36 INFO mapred.JobClient: Task Id : attempt_201411040035_0017_m_000000_1, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1617)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:191)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:631)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1523)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1615)
    ... 8 more

14/11/06 02:56:42 INFO mapred.JobClient: Task Id : attempt_201411040035_0017_m_000000_2, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1617)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:191)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:631)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.ClassNotFoundException: Class hadoopbook.WordCount$WcMapperDemo not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1523)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1615)
    ... 8 more

14/11/06 02:56:54 INFO mapred.JobClient: Job complete: job_201411040035_0017
14/11/06 02:56:54 INFO mapred.JobClient: Counters: 7
14/11/06 02:56:54 INFO mapred.JobClient:   Job Counters 
14/11/06 02:56:54 INFO mapred.JobClient:     Failed map tasks=1
14/11/06 02:56:54 INFO mapred.JobClient:     Launched map tasks=4
14/11/06 02:56:54 INFO mapred.JobClient:     Data-local map tasks=4
14/11/06 02:56:54 INFO mapred.JobClient:     Total time spent by all maps in occupied slots (ms)=33819
14/11/06 02:56:54 INFO mapred.JobClient:     Total time spent by all reduces in occupied slots (ms)=0
14/11/06 02:56:54 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/11/06 02:56:54 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0

I just tried this too, but no luck! I made sure the file is on the Desktop!

[cloudera@localhost ~]$ sudo -u hdfs hadoop jar /home/cloudera/Desktop/WordCount.jar hadoopbook.WordCount /user/cloudera/InputFiles/Small/Words.txt /user/cloudera/Output
Exception in thread "main" java.io.IOException: Error opening job jar: /home/cloudera/Desktop/WordCount.jar
    at org.apache.hadoop.util.RunJar.main(RunJar.java:135)
Caused by: java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:127)
    at java.util.jar.JarFile.<init>(JarFile.java:135)
    at java.util.jar.JarFile.<init>(JarFile.java:72)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:133)
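The `ZipException` above usually means that the file handed to `hadoop jar` is either not readable by the invoking user or is not a well-formed jar at all. As a hedged sketch (this helper is an illustration, not part of Hadoop; the path is the one from the question), the two causes can be told apart before submitting the job:

```shell
# Distinguish an unreadable jar from a corrupt one before running `hadoop jar`.
check_jar() {
    local jar="$1"
    if [ ! -r "$jar" ]; then
        # e.g. `sudo -u hdfs` switches to a user that may not read /home/cloudera
        echo "unreadable: $jar"
    elif unzip -l "$jar" > /dev/null 2>&1 || jar tf "$jar" > /dev/null 2>&1; then
        echo "valid jar: $jar"
    else
        # the "error in opening zip file" case: rebuild/re-export the jar
        echo "corrupt jar: $jar"
    fi
}

check_jar /home/cloudera/Desktop/WordCount.jar
```

If this reports a corrupt jar, re-exporting the project (for example as a runnable JAR from the IDE, or with `jar cf`) is the likely fix rather than changing the command line.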

1 answer:

Answer 0: (score: 0)

Try the following command:

>hadoop jar <<absolute path to WordCount.jar>> hadoopbook.WordCount /user/cloudera/InputFiles/Small/Words.txt /user/cloudera/Output
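Applying this to the paths from the question, one plausible launch sequence looks like the sketch below. The staging step is an assumption: since the job is submitted with `sudo -u hdfs`, the hdfs user may not be able to read files under `/home/cloudera`, so copying the jar to a world-readable location such as `/tmp` sidesteps that. The helper is an illustration, not part of Hadoop.

```shell
# Copy the jar somewhere any user can read and report the path to submit with.
stage_jar() {
    local src="$1"
    local dst="/tmp/$(basename "$src")"
    cp "$src" "$dst" && chmod 644 "$dst" && echo "$dst"
}

# Usage on the cluster (the hadoop invocation itself shown as a comment):
# JAR=$(stage_jar /home/cloudera/Desktop/WordCount.jar)
# sudo -u hdfs hadoop jar "$JAR" hadoopbook.WordCount \
#     /user/cloudera/InputFiles/Small/Words.txt /user/cloudera/Output
```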