I have a 3-node cluster (2 slave nodes) on CentOS 6, configured with Hadoop 3.1.2. Every service (NodeManager and DataNode) is running fine.
I am trying to run a simple jar that counts the words in an access-log file stored in Hadoop, using the following command:
hadoop jar ContarPalabras.jar ContarPalabras /prueba/access-log /salidaLog
This is the class I am executing:
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
/**
 * <p>Count the number of words that appear in a document using MapReduce. The code has a mapper,
 * a reducer, and the main program.</p>
 */
public class ContarPalabras {

    /**
     * <p>The mapper extends org.apache.hadoop.mapreduce.Mapper. When Hadoop runs,
     * each line of the input file is received as input.
     * For every word, the map function emits a (word, 1) pair as output.</p>
     */
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context
                        ) throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    /*
     * The reduce function receives all values that share the same key as input and emits
     * the key and the number of occurrences as output.
     */
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values,
                           Context context
                           ) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    /**
     * The input is any file.
     */
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Uso: ContarPalabras <in> <out>");
            System.exit(2);
        }
        Job job = Job.getInstance(conf, "ContarPalabras");
        job.setJarByClass(ContarPalabras.class);
        job.setMapperClass(TokenizerMapper.class);
        /**** Leave this as is ****/
        //job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
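For reference, the jar is packaged with nothing more than javac and jar, roughly like this (a minimal sketch; the exact file names may differ):

# Compile against the Hadoop client jars and package the compiled classes into the jar
javac -cp "$(hadoop classpath)" ContarPalabras.java
jar cf ContarPalabras.jar ContarPalabras*.class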
When I run the hadoop jar command above, everything works fine until the first map task starts... then I get the following error (the last line is the Spanish-locale message for "Could not find or load main class 1600"):
[2019-05-22 22:33:37.621]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: no se ha encontrado o cargado la clase principal 1600
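That is only the last chunk of prelaunch.err. The complete container logs can be pulled from YARN with something like the following, substituting the real application id that the job prints:

yarn logs -applicationId application_XXXXXXXXXXXXX_XXXX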
This is my yarn-site.xml:
<configuration>
    <!-- Site specific YARN configuration properties -->
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>nodo1</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>
    <property>
        <name>yarn.application.classpath</name>
        <value>
            /opt/hadoop/etc/hadoop,
            /opt/hadoop/share/hadoop/common/*,
            /opt/hadoop/share/hadoop/common/lib/*,
            /opt/hadoop/share/hadoop/hdfs/*,
            /opt/hadoop/share/hadoop/hdfs/lib/*,
            /opt/hadoop/share/hadoop/mapreduce/*,
            /opt/hadoop/share/hadoop/mapreduce/lib/*,
            /opt/hadoop/share/hadoop/tools/*,
            /opt/hadoop/share/hadoop/tools/lib/*,
            /opt/hadoop/share/hadoop/yarn/*,
            /opt/hadoop/share/hadoop/yarn/lib/*
        </value>
    </property>
    <property>
        <name>yarn.nodemanager.vmem-check-enabled</name>
        <value>false</value>
    </property>
</configuration>
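As a quick sanity check on the yarn.application.classpath entries above, they can be compared with what the client itself resolves (assuming /opt/hadoop is HADOOP_HOME on every node):

# Prints the classpath the hadoop command resolves on this node
hadoop classpath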
Any idea what could be wrong? The error does not say which class failed to load.
Thanks!