hdfs dfs -ls stops working after configuring multiple nodes

Asked: 2016-06-19 12:31:00

Tags: hadoop

I was following an online tutorial to configure multiple nodes on a single local VM. This is the hosts file on the master node:

127.0.0.1   localhost
192.168.96.132  hadoop
192.168.96.135  hadoop1
192.168.96.136  hadoop2

ssh:ALL:allow
sshd:ALL:allow

This command used to work: hdfs dfs -ls

Now I get the following error message:

ls: Call From hadoop/192.168.96.132 to hadoop:9000 failed on connection exception: 
java.net.ConnectException: Connection refused; 
For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

What is wrong with my configuration? Where should I look to fix it?

Thanks very much.

1 Answer:

Answer 0: (score: 0)

First, try to ping each node:

ping hadoop
ping hadoop1
ping hadoop2

Then try to connect via ssh. The syntax is:

ssh username@hadoop
ssh username@hadoop1
ssh username@hadoop2

Then check the results to find out whether the systems are connecting or not.
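The ping checks above can be scripted in one loop; this is a minimal sketch that assumes the three hostnames from the /etc/hosts file in the question, and prints one reachability line per host:

```shell
# Check whether each configured node answers a single ping.
# Hostnames are the ones from the question's /etc/hosts; adjust as needed.
for host in hadoop hadoop1 hadoop2; do
  if ping -c 1 -W 2 "$host" > /dev/null 2>&1; then
    echo "$host: reachable"
  else
    echo "$host: unreachable"
  fi
done
```

Note that even if ping and ssh succeed, a "Connection refused" on hadoop:9000 can simply mean the NameNode process is not running or is not listening on that port; running jps on the master shows whether a NameNode process is up, which the linked ConnectionRefused wiki page also suggests checking.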