I get the following error when running a mongo-hadoop streaming job:
java.io.IOException: Cannot run program "mapper.py": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:214)
at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:387)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1157)
at org.apache.hadoop.mapred.Child.main(Child.java:264)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:53)
at java.lang.ProcessImpl.start(ProcessImpl.java:91)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
... 24 more
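For context, error=2 from forkAndExec is the POSIX ENOENT code: the executable named in the streaming command could not be found in the task's working directory. A minimal reproduction of the same OSError (the script name here is made up):

```python
import errno
import subprocess

# Launching a relative path that does not exist in the working directory
# fails with errno 2 (ENOENT) -- the same "No such file or directory"
# seen in the Hadoop task logs. "missing_mapper.py" is an invented name.
try:
    subprocess.Popen(["./missing_mapper.py"])
except OSError as e:
    print(e.errno == errno.ENOENT)  # True
```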
Standard Hadoop Python streaming jobs, run via dumbo or the usual way, work fine.
This error is mentioned in another post about Hadoop Python streaming. I am running the job like this:
hadoop jar /Volumes/Locodrive/hadoop/mongo-hadoop/streaming/target/mongo-hadoop-streaming-assembly-1.1.0-SNAPSHOT.jar -mapper mapper.py -file mapper.py -reducer reducer.py -file reducer.py -inputURI mongodb://localhost:27017/testdb.docs -outputURI mongodb://localhost:27017/testdb.testhadoop
Using relative or absolute paths for mapper.py/reducer.py, and passing absolute paths to the -file arguments, doesn't help. Standard Hadoop streaming jobs run without any problems, so I don't get this error there.
Adding mapper.py and reducer.py to HDFS doesn't help either.
mapper.py and reducer.py are executable and have a shebang on the first line:
mapper.py
#!/usr/bin/env python
import sys
sys.path.append(".")
from pymongo_hadoop import BSONMapper

def mapper(documents):
    i = 0
    for doc in documents:
        i += 1
        yield {"_id": "test", "count": 1}

BSONMapper(mapper)
print >> sys.stderr, "Done Mapping!!!"
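To rule out the mapper logic itself, the generator can be exercised locally without pymongo_hadoop (the sample documents below are invented for illustration):

```python
def mapper(documents):
    # same per-document logic as above, minus the BSON plumbing
    for doc in documents:
        yield {"_id": "test", "count": 1}

sample_docs = [{"title": "a"}, {"title": "b"}]
print(list(mapper(sample_docs)))  # [{'_id': 'test', 'count': 1}, {'_id': 'test', 'count': 1}]
```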
reducer.py
#!/usr/bin/env python
# encoding: utf-8
import sys
sys.path.append('.')
from pymongo_hadoop import BSONReducer

def reducer(key, values):
    print >> sys.stderr, "Processing key %s" % key
    _count = 0
    for v in values:
        _count += v["count"]
    return {"_id": key, "count": _count}

BSONReducer(reducer)
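The reducer's counting logic can be checked locally the same way (the key and value dicts here are invented):

```python
def reducer(key, values):
    # same counting logic as above, minus the BSON plumbing
    _count = 0
    for v in values:
        _count += v["count"]
    return {"_id": key, "count": _count}

print(reducer("test", [{"count": 1}, {"count": 1}, {"count": 1}]))  # {'_id': 'test', 'count': 3}
```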
I am running Cloudera Hadoop CDH3u3 on OS X. The Java examples work without problems.
Update
I tried 0.23.1 and got the same error.
Running with -debug keeps the PackagedJobJar streamjob.jar from being deleted; when I extract it, mapper.py and reducer.py are in there.
The files are also present when running a standard streaming job, yet mongo-hadoop-streaming still produces the above error.
Answer 0 (score: 0)
OK, got it working. It should be -files instead of -file:
http://hadoop.apache.org/common/docs/r0.20.0/api/org/apache/hadoop/util/GenericOptionsParser.html
hadoop jar /Volumes/Locodrive/hadoop/mongo-hadoop/streaming/target/mongo-hadoop-streaming-assembly-1.1.0-SNAPSHOT.jar -files mapper.py,reducer.py -inputURI mongodb://127.0.0.1:27017/mongo_hadoop.yield_historical.in -outputURI mongodb://127.0.0.1:27017/mongo_hadoop.testhadoop -mapper mapper.py -reducer reducer.py -verbose -debug