Why do I get an error when running the "hdfs namenode -format" command while installing Hadoop on Windows 10?

Asked: 2019-10-19 20:33:45

Tags: java windows hadoop

I ran the "hdfs namenode -format" command to set up the NameNode, but it failed with the following error:

2019-10-20 04:15:57,279 INFO util.GSet: Computing capacity for map NameNodeRetryCache
2019-10-20 04:15:57,280 INFO util.GSet: VM type       = 64-bit
2019-10-20 04:15:57,281 INFO util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
2019-10-20 04:15:57,282 INFO util.GSet: capacity      = 2^15 = 32768 entries
2019-10-20 04:15:57,372 INFO namenode.FSImage: Allocated new BlockPoolId: BP-638256157-172.17.18.209-1571516157358
2019-10-20 04:15:57,377 ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsupportedOperationException
        at java.nio.file.Files.setPosixFilePermissions(Files.java:2044)
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:452)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:591)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:613)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:188)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1206)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1649)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1759)
2019-10-20 04:15:57,388 INFO util.ExitUtil: Exiting with status 1: java.lang.UnsupportedOperationException
2019-10-20 04:15:57,395 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at YXYstyle/172.17.18.209
************************************************************/
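The key line in the trace is `java.nio.file.Files.setPosixFilePermissions`, which throws `UnsupportedOperationException` on any file system that does not expose the POSIX permission attribute view, and a typical Windows/NTFS setup does not. A minimal standalone probe (illustrative, not part of Hadoop) can confirm this on a given machine:

```java
import java.nio.file.FileSystems;
import java.util.Set;

public class PosixProbe {
    public static void main(String[] args) {
        // Attribute views supported by the default file system.
        Set<String> views = FileSystems.getDefault().supportedFileAttributeViews();
        // "basic" is always present; "posix" is present on Linux/macOS but
        // absent on a typical Windows/NTFS setup, which is why
        // Files.setPosixFilePermissions throws there.
        System.out.println("posix supported: " + views.contains("posix"));
    }
}
```

On the Windows machine producing the error above, this would be expected to print `posix supported: false`.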

I downloaded hadoop-3.2.1 and the matching winutils from https://github.com/cdarlint/winutils and simply overwrote the bin directory with them, as shown below. (Screenshot: contents of the hadoop-3.2.1/bin directory.)

Below is my core-site.xml configuration:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

hadoop-env.cmd (by the way, I have already set the JAVA_HOME environment variable):

@rem The java implementation to use.  Required.
set JAVA_HOME=%JAVA_HOME%

@rem The jsvc implementation to use. Jsvc is required to run secure datanodes.
@rem set JSVC_HOME=%JSVC_HOME%

hdfs-site.xml:

<configuration>

    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/hadoop-3.2.1/data/namenode</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/hadoop-3.2.1/data/datanode</value>
    </property>
</configuration>
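One thing worth double-checking on Windows, independent of the crash itself: bare paths like `/hadoop-3.2.1/data/namenode` are resolved against the current drive, so Windows setups commonly spell these out as drive-qualified `file:///` URIs instead. A sketch of that variant (the `C:` drive is an assumption; adjust to wherever Hadoop is unpacked):

```xml
<!-- Illustrative hdfs-site.xml fragment for Windows; the drive letter is an assumption -->
<property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///C:/hadoop-3.2.1/data/namenode</value>
</property>
<property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///C:/hadoop-3.2.1/data/datanode</value>
</property>
```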

yarn-site.xml:

<configuration>

<!-- Site specific YARN configuration properties -->
    <property>
        <name>yarn.nodemanager.resource.memory-mb</name>
        <value>1024</value>
    </property>
    <property>
        <name>yarn.nodemanager.resource.cpu-vcores</name>
        <value>1</value>
    </property>

</configuration>

I'm new to Hadoop, so this is all unfamiliar territory for me. Can anyone help? Thanks in advance.

1 Answer:

Answer 0 (score: 0):

This appears to be a bug in the 3.2.1 release. As you can see from the issue below, it was resolved only four days ago. Earlier versions are fine: I tested with 2.6.0 and it works correctly, but 3.2.1 does not.

https://issues.apache.org/jira/browse/HDFS-14890
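For context on why this fails only on Windows: the stack trace shows `Storage$StorageDirectory.clearDirectory` calling `Files.setPosixFilePermissions` unconditionally, and that call throws on file systems without a POSIX view. The general pattern for avoiding this (a sketch of the idea only, not the actual HDFS-14890 patch) is to check for POSIX support first and fall back to `java.io.File` flags otherwise:

```java
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.PosixFilePermissions;

public class PortablePermissions {

    // Illustrative platform-safe permission setter (not Hadoop's actual code).
    static void setOwnerOnlyRwx(Path dir) throws IOException {
        if (FileSystems.getDefault().supportedFileAttributeViews().contains("posix")) {
            // POSIX platforms (Linux, macOS): rwx for the owner, nothing for others.
            Files.setPosixFilePermissions(dir, PosixFilePermissions.fromString("rwx------"));
        } else {
            // Windows/NTFS has no POSIX view; fall back to java.io.File flags
            // (second argument true = owner only).
            dir.toFile().setReadable(true, true);
            dir.toFile().setWritable(true, true);
            dir.toFile().setExecutable(true, true);
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("storage-dir-demo");
        setOwnerOnlyRwx(dir);
        System.out.println("permissions set on " + dir);
    }
}
```

Practical upshot for the question: either wait for a release containing the HDFS-14890 fix, or use an earlier Hadoop version on Windows in the meantime.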