I have successfully installed Hadoop on my server (Ubuntu 16.04). I am trying to start Hadoop with ./start-dfs.sh, and everything appears to go well:
songdian@sparkmaster~/local/hadoop/hadoop-2.6.0/sbin$ ./start-dfs.sh
18/01/15 00:00:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [sparkmaster]
sparkmaster: starting namenode, logging to /home/songdian/local/hadoop/hadoop-2.6.0/logs/hadoop-songdian-namenode-sparkmaster.out
sparkworker1: starting datanode, logging to /home/songdian/local/hadoop/hadoop-2.6.0/logs/hadoop-songdian-datanode-sparkworker1.out
Starting secondary namenodes [sparkmaster]
sparkmaster: starting secondarynamenode, logging to /home/songdian/local/hadoop/hadoop-2.6.0/logs/hadoop-songdian-secondarynamenode-sparkmaster.out
18/01/15 00:00:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Here jps shows the NameNode (I have also started Spark):
songdian@sparkmaster~/local/hadoop/hadoop-2.6.0/sbin$ jps
27344 ResourceManager
31233 Jps
30753 NameNode
24919 Master
However, when I try to create a new directory on HDFS, I get an error:
songdian@sparkmaster~/local/hadoop/hadoop-2.6.0/sbin$ hadoop fs -mkdir /data
18/01/15 00:34:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: Call From sparkmaster/10.12.3.2 to sparkmaster:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
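The "Connection refused" error suggests nothing is actually listening on sparkmaster:9000. As a quick probe I tried a small helper (the `check_port` name is my own; the host and port are taken from the error message above):

```shell
# Hypothetical helper: probe a TCP port using bash's /dev/tcp feature.
# Host "sparkmaster" and port 9000 come from the error message above.
check_port() {
  if timeout 2 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null; then
    echo open
  else
    echo closed
  fi
}

check_port sparkmaster 9000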
I have been looking for the cause of the problem. When I tried to stop Hadoop, there was a clue:
songdian@sparkmaster~/local/hadoop/hadoop-2.6.0/sbin$ ./stop-dfs.sh
18/01/15 00:14:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [sparkmaster]
sparkmaster: no namenode to stop
sparkworker1: no datanode to stop
Stopping secondary namenodes [sparkmaster]
sparkmaster: no secondarynamenode to stop
18/01/15 00:14:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
In addition, I checked my worker node (sparkworker1), and there is no DataNode in jps:
songdian@sparkworker1:~$ jps
19861 NodeManager
18862 Worker
25262 Jps
I am confident that I have configured hadoop-env.sh, yarn-env.sh, mapred-env.sh, core-site.xml, hdfs-site.xml, mapred-site.xml, and yarn-site.xml correctly.
Can anyone help me?
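For reference, the address in the error message (sparkmaster:9000) would normally come from the fs.defaultFS property in core-site.xml; a typical entry looks like the sketch below (hostname and port are assumed from the error message, not copied from my actual file):

```xml
<!-- core-site.xml: sketch of a typical fs.defaultFS entry.
     The hostname/port are assumed from the error message above. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://sparkmaster:9000</value>
  </property>
</configuration>
```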