I am trying to run the canonical Hadoop hello-world code, 100% based on WordCount, but when I submit it as a job through the REST API, the FIWARE platform returns an error.
The following code runs fine on the private Hadoop cluster I use for testing, but not inside the FI-WARE platform, and I cannot work out why.
This is the code:
package smartive;

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.util.*;

public class Hello {

    // Mapper: emits (word, 1) for every token in the input line.
    public static class Map extends MapReduceBase implements Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value, OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                output.collect(word, one);
            }
        }
    }

    // Reducer: sums the counts emitted for each word.
    public static class Reduce extends MapReduceBase implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values, OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(Hello.class);
        conf.setJobName("Hello");

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        conf.setMapperClass(Map.class);
        conf.setCombinerClass(Reduce.class);
        conf.setReducerClass(Reduce.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);

        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}
I compile the code with javac and package it into a jar whose entry point is set to smartive.Hello.
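For reference, the build looks roughly like this (a sketch, not my exact commands; it assumes a local Hadoop client so that `hadoop classpath` resolves the compile-time dependencies):

```shell
# Compile against the Hadoop client libraries (classpath resolution is an assumption)
javac -classpath "$(hadoop classpath)" -d classes smartive/Hello.java

# Create the jar with smartive.Hello as the manifest entry point ("e" flag)
jar cfe Hello.jar smartive.Hello -C classes .
```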
Then I make the following REST call against cosmos.lab.fiware:
curl -X POST "http://computing.cosmos.lab.fiware.org:12000/tidoop/v1/user/myuser/jobs" -d '{"jar":"Hello.jar","class_name":"Hello","lib_jars":"","input":"data/in","output":"data/out"}' -H "Content-Type: application/json" -H "X-Auth-Token: myOauth2token"
But I get this result:
{"success":"false","error":1}
The expected response should look like this:
{"success":"true","job_id": "job_1460639183882_0001"}
But I don't know how to debug this on the new Cosmos, since there is no SSH access and the response carries no error message.
The input data looks fine (a test.txt file inside the data/ folder), and both the data and data/ folders have 777 permissions.
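A possible way to double-check the input path without SSH is the WebHDFS/HttpFS REST interface; the storage host and port below are assumptions about the Cosmos global instance (adjust to your deployment), while LISTSTATUS is the standard WebHDFS directory-listing operation:

```shell
# List the job's input directory over WebHDFS (host/port are assumptions)
curl -X GET \
  "http://storage.cosmos.lab.fiware.org:14000/webhdfs/v1/user/myuser/data/in?op=LISTSTATUS&user.name=myuser" \
  -H "X-Auth-Token: myOauth2token"
```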
Does anyone have any hints or ideas about where I went wrong or what I am missing?
Many thanks in advance.