Hadoop datanode is not running

Time: 2015-11-20 08:52:37

Tags: apache hadoop mapreduce hdfs cloudera

I have installed Hadoop on my laptop, and all of the services are running except the DataNode. Initially the NameNode and Secondary NameNode were not running either; after making some changes/permission fixes for them, they now start fine.

hduse@Lenovo-IdeaPad-S510p:/usr/local/hadoop/sbin$ jps
14339 NameNode
16579 Jps
15571 NodeManager
15076 SecondaryNameNode
15231 ResourceManager
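
For reference, the fix for the NameNode and Secondary NameNode was an ownership/permission adjustment roughly along these lines (a hypothetical sketch only; the exact commands may have differed, with the hduser:hadoop owner/group and path taken from the directory listings further below):

# Hypothetical example of the kind of fix applied to the NameNode storage directory;
# path and hduser:hadoop owner/group are assumed from the listings below.
sudo chown -R hduser:hadoop /usr/local/hadoop_store/hdfs/namenode
sudo chmod -R 777 /usr/local/hadoop_store/hdfs/namenode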

Here is my hdfs-site.xml configuration file:

<configuration>
 <property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication.
  The actual number of replications can be specified when the file is created.
  The default is used if replication is not specified in create time.
  </description>
 </property>
 <property>
   <name>dfs.namenode.name.dir</name>
   <value>file:/usr/local/hadoop_store/hdfs/namenode</value>
 </property>
 <property>
   <name>dfs.datanode.data.dir</name>
   <value>file:/usr/local/hadoop_store/hdfs/datanode</value>
 </property>
</configuration>

Permissions on the directories:

hduse@Lenovo-IdeaPad-S510p:/usr/local/hadoop/sbin$ ls -ld /usr/local/hadoop_store/hdfs/namenode
drwxrwxrwx 3 hduser hadoop 4096 Nov 20 13:51 /usr/local/hadoop_store/hdfs/namenode

hduse@Lenovo-IdeaPad-S510p:/usr/local/hadoop/sbin$ ls -ld /usr/local/hadoop_store/hdfs/datanode
drwxrwxrwx 2 hduser hadoop 4096 Nov 17 14:10 /usr/local/hadoop_store/hdfs/datanode

DataNode log file:

hduse@Lenovo-IdeaPad-S510p:/usr/local/hadoop/sbin$ less /usr/local/hadoop/logs/hadoop-hduse-datanode-Lenovo-IdeaPad-S510p.log

......./*some data truncated*/......
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
STARTUP_MSG:   java = 1.8.0_66
************************************************************/
2015-11-20 13:51:42,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2015-11-20 13:51:43,305 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Invalid dfs.datanode.data.dir /usr/local/hadoop_store/hdfs/datanode : 
EPERM: Operation not permitted
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:230)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:652)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:490)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:140)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2202)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2378)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2402)
2015-11-20 13:51:43,307 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/usr/local/hadoop_store/hdfs/datanode/" 
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2290)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2202)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2378)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2402)
2015-11-20 13:51:43,309 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2015-11-20 13:51:43,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Lenovo-IdeaPad-S510p/127.0.1.1
************************************************************/

The DataNode directory path has full permissions. I have tried clearing the tmp directory and formatting the namenode, but the DataNode still does not start.
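
The EPERM in the log above is raised by the native chmod call (NativeIO$POSIX.chmodImpl). chmod fails with EPERM when the calling user does not own the directory, no matter how permissive the mode bits already are, so it is worth confirming that the user starting the daemons owns the datanode directory. A small diagnostic sketch (the chown is only needed if the owner and the running user differ; the hduser:hadoop owner/group is assumed from the listings above):

# Compare the user starting the daemons with the owner of the storage directory.
whoami
ls -ld /usr/local/hadoop_store/hdfs/datanode
# If they differ, hand the directory over to the daemon user
# (hduser:hadoop assumed from the listings above).
sudo chown -R hduser:hadoop /usr/local/hadoop_store/hdfs/datanode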

Please suggest any changes needed to make the DataNode start successfully.

1 Answer:

Answer 0 (score: 0)

  1. First delete all contents of the temporary folder: rm -Rf <tmp dir> (mine is /usr/local/hadoop/hadoop_tmp)
  2. Format the namenode: bin/hadoop namenode -format
  3. Start all processes again: bin/start-all.sh
  4. The namenode has to be formatted the first time Hadoop is started; that is what caused the problem. The full sequence is sketched below.
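
Put together, the steps above look roughly like this (a sketch assuming the /usr/local/hadoop/hadoop_tmp temporary directory mentioned in step 1 and a layout where the scripts live under bin/; on newer 2.x releases the equivalents are hdfs namenode -format, start-dfs.sh, and start-yarn.sh):

# 1. Remove everything from the temporary folder (path as used in this answer).
rm -Rf /usr/local/hadoop/hadoop_tmp/*
# 2. Reformat the namenode (this wipes existing HDFS metadata).
bin/hadoop namenode -format
# 3. Start all daemons again.
bin/start-all.sh
# Verify that the DataNode now appears in the process list.
jps

Note that formatting the namenode destroys any existing HDFS data, which is acceptable on a fresh single-node setup like this one.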