MapReduce job submitted via Java ProcessBuilder does not terminate

Date: 2013-12-19 07:52:07

Tags: java hadoop process mapreduce

I have a MapReduce job packaged as a jar file, say 'mapred.jar'. The JobTracker is running on a remote Linux machine. When I run the jar from my local machine, the job in the jar is submitted to the remote JobTracker, and it works fine, as shown below:

java -jar F:/hadoop/mapred.jar
     13/12/19 12:40:27 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
     13/12/19 12:40:27 INFO input.FileInputFormat: Total input paths to process : 49
     13/12/19 12:40:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
     13/12/19 12:40:27 WARN snappy.LoadSnappy: Snappy native library not loaded
     13/12/19 12:40:28 INFO mapred.JobClient: Running job: job_201312160716_0063
     13/12/19 12:40:29 INFO mapred.JobClient:  map 0% reduce 0%
     13/12/19 12:40:50 INFO mapred.JobClient:  map 48% reduce 0%
     13/12/19 12:40:53 INFO mapred.JobClient:  map 35% reduce 0%
     13/12/19 12:40:56 INFO mapred.JobClient:  map 29% reduce 0%
     13/12/19 12:41:02 INFO mapred.JobClient:  map 99% reduce 0%
     13/12/19 12:41:08 INFO mapred.JobClient:  map 100% reduce 0%
     13/12/19 12:41:23 INFO mapred.JobClient:  map 100% reduce 100%
     13/12/19 12:41:28 INFO mapred.JobClient: Job complete: job_201312160716_0063
      ...

But when I do the same thing through Java's ProcessBuilder, as follows:

    ProcessBuilder pb = new ProcessBuilder("java", "-jar", "F:/hadoop/mapred.jar");
    pb.directory(new File("D:/test"));
    final Process process = pb.start();
    InputStream is = process.getInputStream();
    InputStreamReader isr = new InputStreamReader(is);
    BufferedReader br = new BufferedReader(isr);
    String line;
    while ((line = br.readLine()) != null) {
        System.out.println(line);
    }

    System.out.println("Waited for: " + process.waitFor());
    System.out.println("Program terminated! ");

It also works, and I can monitor the job status at http://192.168.1.112:50030/jobtracker.jsp.

Problem

My problem is that the Java program never terminates; it keeps running indefinitely even after the MapReduce job has completed! Also, I don't get any of the output messages that I see when running from the command line. How can I tell that the job has finished?

1 Answer:

Answer 0 (score: 2)

You should redirect stderr to stdout before you start reading:

pb.redirectErrorStream(true)

The reason is described in the documentation of the Process class:

...failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.
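
Applied to your snippet, a minimal sketch of the corrected launcher looks like this (same jar path and working directory as above; Hadoop's JobClient progress messages typically go to stderr via log4j, which would explain why you saw no output and why the unread stderr buffer could eventually block the child):

    ProcessBuilder pb = new ProcessBuilder("java", "-jar", "F:/hadoop/mapred.jar");
    pb.directory(new File("D:/test"));
    pb.redirectErrorStream(true);   // merge stderr into stdout before start()
    Process process = pb.start();

    // Drain the single merged stream; readLine() returns null once the child
    // closes its output, so the loop ends when the job client exits.
    BufferedReader br = new BufferedReader(
            new InputStreamReader(process.getInputStream()));
    String line;
    while ((line = br.readLine()) != null) {
        System.out.println(line);
    }
    br.close();

    System.out.println("Waited for: " + process.waitFor());
    System.out.println("Program terminated!");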

If you are on Java 7, where ProcessBuilder and Process were significantly improved, you can also do

pb.inheritIO()

which redirects the subprocess's stderr and stdout to those of your Java process.
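
For example, a Java 7 sketch of the same launcher (jar path and working directory as above):

    ProcessBuilder pb = new ProcessBuilder("java", "-jar", "F:/hadoop/mapred.jar");
    pb.directory(new File("D:/test"));
    pb.inheritIO();                  // child writes directly to this JVM's console
    Process process = pb.start();
    System.out.println("Exit code: " + process.waitFor());  // returns once the job client exits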

Update: By the way, you are better off submitting Hadoop jobs through the Hadoop API (the Job and Configuration classes); see, for example, Calling a mapreduce job from a simple java program.
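
A minimal sketch of that approach, assuming Hadoop 1.x and the new mapreduce API; the mapper/reducer classes, cluster addresses, and HDFS paths below are placeholders for whatever mapred.jar actually configures:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class SubmitJob {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Point the client at the remote cluster; addresses and ports are placeholders.
            conf.set("fs.default.name", "hdfs://192.168.1.112:9000");
            conf.set("mapred.job.tracker", "192.168.1.112:9001");

            Job job = new Job(conf, "mapred job");
            job.setJarByClass(SubmitJob.class);
            job.setMapperClass(MyMapper.class);      // placeholder mapper
            job.setReducerClass(MyReducer.class);    // placeholder reducer
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path("/input"));
            FileOutputFormat.setOutputPath(job, new Path("/output"));

            // waitForCompletion(true) blocks until the job finishes and prints
            // its progress, so the program exits as soon as the job does.
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }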