I'm running a single-node cluster on my Mac; I can create a directory in HDFS and put a file into it:
hadoop fs -mkdir sampleData
hadoop fs -put ~/SampleTweets.csv sampleData
But when I run the simple streaming job below, it fails with the error shown after it:
hadoop jar ~/cloudera/cdh5.5/hadoop/share/hadoop/mapreduce1/contrib/streaming/hadoop-streaming-2.6.0-mr1-cdh5.5.1.jar \
-Dmapred.reduce.tasks=1 \
-input sampleData \
-output sampleData/output1 \
-mapper cat \
-reducer "wc -l"
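For reference, streaming mappers and reducers are plain Unix commands, so the same pipeline can be validated locally before submitting (a minimal sketch; /tmp/sample.csv is a stand-in for the real ~/SampleTweets.csv):

```shell
# Non-distributed equivalent of the job above: "cat" as the mapper,
# "wc -l" as the reducer. This rules out the commands themselves.
printf 'row1\nrow2\nrow3\n' > /tmp/sample.csv
cat /tmp/sample.csv | wc -l    # → 3, the line count the job should emit
```

The mapper and reducer work fine locally, so the failure is in job launch, not the pipeline.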
Error on screen, and at http://localhost:8088/cluster/app/application_1453348020587_0005:
Application application_1453348020587_0005 failed 2 times due to Error launching appattempt_1453348020587_0005_000002.
Got exception: java.io.IOException: Failed on local exception: java.net.SocketException:
Host is down; Host Details : local host is: "tanna-iMac.local/192.168.1.13";
destination host is: "192.168.1.4":58893;
My /etc/hosts:
cat /etc/hosts
127.0.0.1 localhost
255.255.255.255 broadcasthost
192.168.1.13 localhost
::1 localhost
fe80::1%lo0 localhost
I have no idea where it picked up the destination host 192.168.1.4 from, or what service is supposed to be running on port 58893.
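One thing that looks suspicious to me is the `192.168.1.13 localhost` line: a daemon that resolves its own hostname through that file could advertise an address its peers cannot reach. A quick self-contained check (a sketch; the `check_hosts` function and `/tmp/hosts.sample` path are my own names, reproducing the file quoted above):

```shell
# Flag any /etc/hosts line that maps a non-loopback address to "localhost".
check_hosts() {
    awk '$2 == "localhost" && $1 != "127.0.0.1" && $1 != "::1" && $1 !~ /^fe80/ {
        print "suspicious:", $1, $2; bad = 1
    } END { exit !bad }' "$1"
}

# Reproduce the hosts file from the question and run the check on it.
cat > /tmp/hosts.sample <<'EOF'
127.0.0.1 localhost
255.255.255.255 broadcasthost
192.168.1.13 localhost
::1 localhost
fe80::1%lo0 localhost
EOF
check_hosts /tmp/hosts.sample   # prints: suspicious: 192.168.1.13 localhost
```

Could this mapping explain the wrong destination host, or is 192.168.1.4 coming from somewhere else entirely (a stale DHCP lease, another machine on the LAN)?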