I am running the rmr2 example from here; this is the code I tried:
Sys.setenv(HADOOP_HOME="/home/istvan/hadoop")
Sys.setenv(HADOOP_CMD="/home/istvan/hadoop/bin/hadoop")
library(rmr2)
library(rhdfs)
ints = to.dfs(1:100)                     # writes 1:100 to DFS as a sequence file
calc = mapreduce(input = ints,
                 map = function(k, v) cbind(v, 2*v))   # pair each value with its double
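For comparison, the same pipeline should be runnable without Hadoop streaming at all via rmr2's local backend; a minimal sketch, assuming the local backend is available in this rmr2 version:

# Sketch: run the same map in-process, bypassing streaming entirely,
# to confirm the R side of the pipeline works on its own.
library(rmr2)
rmr.options(backend = "local")
ints = to.dfs(1:100)
calc = mapreduce(input = ints,
                 map = function(k, v) cbind(v, 2*v))
from.dfs(calc)   # read the result back as key/value pairs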
I am using hadoop-streaming-1.1.1.jar. After I call the mapreduce function the job starts, but it then fails with this exception:
2013-12-16 16:26:14,844 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2013-12-16 16:26:15,600 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /app/cloudera/mapred/local/taskTracker/nkumar/jobcache/job_201312160142_0009/jars/job.jar <- /app/cloudera/mapred/local/taskTracker/nkumar/jobcache/job_201312160142_0009/attempt_201312160142_0009_m_000000_0/work/job.jar
2013-12-16 16:26:15,604 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /app/cloudera/mapred/local/taskTracker/nkumar/jobcache/job_201312160142_0009/jars/.job.jar.crc <- /app/cloudera/mapred/local/taskTracker/nkumar/jobcache/job_201312160142_0009/attempt_201312160142_0009_m_000000_0/work/.job.jar.crc
2013-12-16 16:26:15,693 WARN org.apache.hadoop.conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
2013-12-16 16:26:15,695 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=MAP, sessionId=
2013-12-16 16:26:16,312 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
2013-12-16 16:26:16,319 INFO org.apache.hadoop.mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@6bdc64a5
2013-12-16 16:26:16,757 WARN mapreduce.Counters: Counter name MAP_INPUT_BYTES is deprecated. Use FileInputFormatCounters as group name and BYTES_READ as counter name instead
2013-12-16 16:26:16,763 INFO org.apache.hadoop.mapred.MapTask: numReduceTasks: 1
2013-12-16 16:26:16,772 INFO org.apache.hadoop.mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2013-12-16 16:26:16,779 INFO org.apache.hadoop.mapred.MapTask: io.sort.mb = 450
2013-12-16 16:26:17,432 INFO org.apache.hadoop.mapred.MapTask: data buffer = 358612992/448266240
2013-12-16 16:26:17,432 INFO org.apache.hadoop.mapred.MapTask: record buffer = 1179648/1474560
2013-12-16 16:26:17,477 INFO org.apache.hadoop.streaming.PipeMapRed: PipeMapRed exec [/usr/bin/Rscript, ./rmr-streaming-map5b17a2a9ff]
2013-12-16 16:26:17,561 INFO org.apache.hadoop.streaming.PipeMapRed: R/W/S=1/0/0 in:NA [rec/s] out:NA [rec/s]
2013-12-16 16:26:17,570 INFO org.apache.hadoop.streaming.PipeMapRed: MRErrorThread done
2013-12-16 16:26:17,571 INFO org.apache.hadoop.streaming.PipeMapRed: PipeMapRed failed!
2013-12-16 16:26:17,587 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-12-16 16:26:17,591 WARN org.apache.hadoop.mapred.Child: Error running child
java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 2
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:362)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:576)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:135)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:36)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
2013-12-16 16:26:17,605 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
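The key line seems to be "subprocess failed with code 2": the Rscript child process that streaming launches is exiting before producing output. One check that comes to mind (hypothetical; run on the task node itself, using the exact interpreter path from the PipeMapRed line above):

# Hypothetical task-node check: streaming execs /usr/bin/Rscript, so that
# exact binary must exist there and must be able to load rmr2.
system("/usr/bin/Rscript -e 'library(rmr2); cat(\"rmr2 OK\\n\")'")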
A sequence file does get created in the /tmp directory on HDFS. Any suggestions on how to fix this? Thanks.
Edit:
I found this answer, Hadoop Streaming Job failed error in python, so I tried running the R script with these two lines at the top:
#!/usr/bin/Rscript
#!/usr/bin/env Rscript
No luck.
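For the record, these are the per-node sanity checks I would try next (a sketch; /usr/bin/Rscript is the path taken from the streaming log above):

# Sketch of per-node checks, run as the user the TaskTracker runs under:
Sys.which("Rscript")                    # where Rscript resolves on this node
file.exists("/usr/bin/Rscript")         # the exact path streaming invokes
.libPaths()                             # library paths visible to that user
c("rmr2", "rhdfs") %in% rownames(installed.packages())   # packages visible?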