Cannot add a DataNode to Hadoop

Posted: 2013-05-13 10:09:08

Tags: hadoop

All my settings are correct and I can run Hadoop (1.1.2) on a single node. However, after making the changes to the relevant files (/etc/hosts, *-site.xml), I cannot add a DataNode to the cluster, and I keep getting the following error on the slave:

Does anyone know how to fix this?

2013-05-13 15:36:10,135 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2013-05-13 15:36:11,137 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2013-05-13 15:36:12,140 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

2 Answers:

Answer 0 (score: 0):

Check the value of fs.default.name in the core-site.xml conf file (on each node in the cluster). This needs to be the network name of the namenode; I suspect you have it set to hdfs://localhost:54310.
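For example, a minimal sketch of the corrected property, where master is an assumed hostname standing in for your namenode's actual network name:

<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- must be the namenode's hostname or IP, not localhost,
         or remote datanodes will try to connect to themselves -->
    <value>hdfs://master:54310</value>
  </property>
</configuration>

This value has to agree on the namenode and on every datanode in the cluster.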

Failing that, check for any mention of localhost in the Hadoop configuration files on all nodes in the cluster:

grep localhost $HADOOP_HOME/conf/*.xml
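If that turns up matches, each one should point at a hostname the other nodes can actually reach. A quick sketch, assuming the namenode is reachable as master:

# rewrite localhost to the assumed hostname "master" in the site configs
sed -i 's/localhost/master/g' $HADOOP_HOME/conf/core-site.xml $HADOOP_HOME/conf/mapred-site.xml

Restart the daemons afterwards so the new addresses take effect.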

Answer 1 (score: 0):

Try replacing localhost with the IP address or network name of the namenode.
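A sketch of what that looks like, assuming the namenode's address is 192.168.1.10: give it a name in /etc/hosts on every node,

192.168.1.10   master   # namenode (assumed address and name)
192.168.1.11   slave1   # datanode (assumed address and name)

then point fs.default.name at hdfs://master:54310 (or directly at hdfs://192.168.1.10:54310) on all machines.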