Hadoop 2.6.1 single-node setup: DataNode not starting

Posted: 2015-10-07 07:33:56

Tags: hadoop hdfs

I am trying to set up Hadoop 2.6.1 following the instructions here, but my DataNode does not start. When I run jps, I only get the following processes:

▶ jps
8406 ResourceManager
7744 NameNode
8527 NodeManager
8074 SecondaryNameNode
9121 Jps

DataNode log:

2015-10-07 13:02:24,144 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Invalid dfs.datanode.data.dir /home/vinod/.hadoopdata/hdfs/datanode : 
EPERM: Operation not permitted
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:230)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:652)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:490)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:140)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2299)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2341)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2323)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2215)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2262)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2438)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2462)
2015-10-07 13:02:24,147 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/home/vinod/.hadoopdata/hdfs/datanode/" 
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2350)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2323)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2215)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2262)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2438)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2462)
2015-10-07 13:02:24,148 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2015-10-07 13:02:24,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at BBDSK0201/127.0.1.1
************************************************************/

Please help me figure out what I am missing.

1 Answer:

Answer 0 (score: 1)

1) Make sure the directory has the correct owner and permissions, for example:

$ sudo chown -R hduser:hadoop /home/vinod/.hadoopdata/hdfs/datanode
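
Since the directory lives under /home/vinod, the owner may need to be your own user rather than hduser. A quick check and fix, assuming the path reported in the log above (adjust the user and group to whoever actually runs the DataNode):

# See who currently owns the datanode directory and its parents
$ ls -ld /home/vinod/.hadoopdata /home/vinod/.hadoopdata/hdfs /home/vinod/.hadoopdata/hdfs/datanode
# If HDFS runs as user vinod, own it as vinod instead of hduser
$ sudo chown -R vinod:vinod /home/vinod/.hadoopdata/hdfs/datanode
# The DataNode also chmods this directory at startup, so it must be writable by its owner
$ chmod -R 755 /home/vinod/.hadoopdata/hdfs/datanode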

2) Delete the contents of the temp directory given by the hadoop.tmp.dir parameter.
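
If hadoop.tmp.dir is not set explicitly in core-site.xml, it defaults to /tmp/hadoop-${user.name}. A hedged sketch of clearing it (check the actual value first so you do not delete the wrong directory; the paths below are only examples):

# Look up where hadoop.tmp.dir actually points (Hadoop 2.x keeps its config under etc/hadoop)
$ grep -A1 'hadoop.tmp.dir' $HADOOP_HOME/etc/hadoop/core-site.xml
# Remove its contents; /tmp/hadoop-vinod is only the default-style location, substitute your own value
$ rm -rf /tmp/hadoop-vinod/*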

3) Format the namenode.
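
For example (note that formatting erases any existing HDFS metadata, which is fine for a fresh single-node setup):

# Re-format the namenode; answer Y if asked to confirm re-formatting
$ hdfs namenode -format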

Start all the processes again. Hope this helps...
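
To bring everything back up and verify, something along these lines should work with the standard Hadoop 2.x sbin scripts (assuming HADOOP_HOME points at your installation):

# Stop anything still running, then start HDFS and YARN again
$ $HADOOP_HOME/sbin/stop-yarn.sh && $HADOOP_HOME/sbin/stop-dfs.sh
$ $HADOOP_HOME/sbin/start-dfs.sh && $HADOOP_HOME/sbin/start-yarn.sh
# The DataNode process should now appear in the list
$ jps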