Namenode and secondary namenode not starting

Time: 2020-03-31 03:08:51

Tags: hadoop hdfs

I am new to the Hadoop world and am trying to learn the basics from a tutorial [1]. I have done a fresh install of Hadoop 2.9.2 on my machine in pseudo-distributed mode with the following settings.

core-site.xml

<configuration>
  <propeerty>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </propeerty>
</configuration>

hdfs-site.xml

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

mapred-site.xml

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

yarn-site.xml

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>

I ran bin/hdfs namenode -format without any errors, and then finally ran sbin/start-dfs.sh; I did not notice any errors in the following output:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/Users/myusername/hadoop-install/hadoop-2.9.2/share/hadoop/common/lib/hadoop-auth-2.9.2.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
20/03/30 21:44:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on []
Password:
localhost: starting namenode, logging to /Users/myusername/hadoop-install/hadoop-2.9.2/logs/hadoop-myusername-namenode-C02WT020HTDG.out
Password:
localhost: starting datanode, logging to /Users/myusername/hadoop-install/hadoop-2.9.2/logs/hadoop-myusername-datanode-C02WT020HTDG.out
localhost: WARNING: An illegal reflective access operation has occurred
localhost: WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/Users/myusername/hadoop-install/hadoop-2.9.2/share/hadoop/common/lib/hadoop-auth-2.9.2.jar) to method sun.security.krb5.Config.getInstance()
localhost: WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
localhost: WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
localhost: WARNING: All illegal access operations will be denied in a future release
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is SHA256:M1QP8tl98stYKNcIBmKYTuRoasil3AafGqIq3FZ1Vv8.
Are you sure you want to continue connecting (yes/no/[fingerprint])? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
Password:
0.0.0.0: starting secondarynamenode, logging to /Users/myusername/hadoop-install/hadoop-2.9.2/logs/hadoop-myusername-secondarynamenode-C02WT020HTDG.out
0.0.0.0: WARNING: An illegal reflective access operation has occurred
0.0.0.0: WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/Users/myusername/hadoop-install/hadoop-2.9.2/share/hadoop/common/lib/hadoop-auth-2.9.2.jar) to method sun.security.krb5.Config.getInstance()
0.0.0.0: WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
0.0.0.0: WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
0.0.0.0: WARNING: All illegal access operations will be denied in a future release
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/Users/myusername/hadoop-install/hadoop-2.9.2/share/hadoop/common/lib/hadoop-auth-2.9.2.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
20/03/30 21:44:48 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

[1]: https://app.pluralsight.com/library/courses/building-blocks-hadoop-hdfs-mapreduce-yarn/table-of-contents

Errors in the namenode log:

2020-03-31 09:35:56,962 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
    at org.apache.hadoop.hdfs.DFSUtil.getNNServiceRpcAddressesForCluster(DFSUtil.java:576)
    at org.apache.hadoop.hdfs.server.datanode.BlockPoolManager.refreshNamenodes(BlockPoolManager.java:152)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1392)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:495)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2695)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2598)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2645)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2789)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2813)
2020-03-31 09:35:56,966 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1: java.io.IOException: Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.

Secondary namenode log:

2020-03-31 09:36:05,237 FATAL org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Failed to start secondary namenode
java.lang.IllegalArgumentException: Invalid URI for NameNode address (check fs.defaultFS): file:/// has no authority.
    at org.apache.hadoop.hdfs.DFSUtilClient.getNNAddress(DFSUtilClient.java:626)
    at org.apache.hadoop.hdfs.DFSUtilClient.getNNAddressCheckLogical(DFSUtilClient.java:655)
    at org.apache.hadoop.hdfs.DFSUtilClient.getNNAddress(DFSUtilClient.java:617)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:519)
    at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:231)
    at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:194)
    at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:690)
2020-03-31 09:36:05,241 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1: ExitException

When I go to localhost:50070 I get nothing (the site cannot be reached), and when I run jps I do not see any namenode or secondary namenode process running.

What am I doing wrong?

Is there something that still needs to be configured, or is anything missing?

Thanks in advance!

1 Answer:

Answer 0 (score: 1):

There is a typo in the property tag of your core-site.xml! It must be property, not propeerty:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

Also, you have not set the dfs.namenode.name.dir and dfs.datanode.data.dir properties in hdfs-site.xml. It is always recommended to configure these properties, or otherwise to point the hadoop.tmp.dir property (in core-site.xml) to some directory other than /tmp.
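
For reference, a minimal hdfs-site.xml along those lines might look like the sketch below; the file:// paths are only placeholders I am assuming here (they are not from your setup), so point them at any directory your user can write to:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///Users/myusername/hadoop-data/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///Users/myusername/hadoop-data/datanode</value>
  </property>
</configuration>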

Update these properties, format the namenode, and start the services.
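
If it helps, a possible sequence is shown below, assuming you run it from the Hadoop installation directory (keep in mind that reformatting wipes any existing HDFS metadata, which is fine for a fresh install):

sbin/stop-dfs.sh            # stop anything that may still be running
bin/hdfs namenode -format   # reformat the namenode with the corrected configuration
sbin/start-dfs.sh           # start namenode, datanode and secondary namenode
jps                         # NameNode, DataNode and SecondaryNameNode should now be listed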