Hadoop single node: [Fatal Error] core-site.xml:11:17: Content is not allowed in trailing section

Date: 2016-04-22 13:53:04

Tags: hdfs

I am unable to start the namenode. The Hadoop version is hadoop-1.2.1. I have already formatted the namenode and cleared the tmp directory at /app/hadoop/tmp. The configuration in core-site.xml is given below:

<configuration>
     <property>
         <name>fs.default.name</name>
         <value>hdfs://localhost:9000</value>
     </property>
</configuration>>
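
Note: the doubled > after the closing </configuration> tag above is content after the XML root element, which is the kind of thing the "Content is not allowed in trailing section" error refers to. Assuming that extra character is really present in the file on disk (and not just a copy-paste slip here), the file should end cleanly like this:

<configuration>
     <property>
         <name>fs.default.name</name>
         <value>hdfs://localhost:9000</value>
     </property>
</configuration>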

hdfs-site.xml

<configuration>
     <property>
         <name>dfs.replication</name>
         <value>1</value>
     </property>
</configuration>

mapred-site.xml

<configuration>
     <property>
         <name>mapred.job.tracker</name>
         <value>localhost:9001</value>
     </property>
</configuration>

This is the error that is displayed:

priyank@priyank-Ideapad-Z570:~$ hadoop namenode
16/04/22 19:03:35 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = priyank-Ideapad-Z570/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.2.1
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013
STARTUP_MSG:   java = 1.7.0_95
************************************************************/
[Fatal Error] core-site.xml:11:17: Content is not allowed in trailing section.
16/04/22 19:03:35 FATAL conf.Configuration: error parsing conf file: org.xml.sax.SAXParseException; systemId: file:/home/priyank/Desktop/Softwares/Hadoop/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 17; Content is not allowed in trailing section.
16/04/22 19:03:35 ERROR namenode.NameNode: java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:/home/priyank/Desktop/Softwares/Hadoop/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 17; Content is not allowed in trailing section.
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1249)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1107)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1053)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:420)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.setStartupOption(NameNode.java:1374)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1463)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
Caused by: org.xml.sax.SAXParseException; systemId: file:/home/priyank/Desktop/Softwares/Hadoop/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 17; Content is not allowed in trailing section.
    at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:257)
    at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:338)
    at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:177)
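
A quick way to confirm where a Hadoop config file becomes malformed (assuming the xmllint tool from libxml2 is installed) is to validate each file directly; it points at the offending line much like the SAXParseException above:

xmllint --noout conf/core-site.xml
xmllint --noout conf/hdfs-site.xml
xmllint --noout conf/mapred-site.xml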

1 Answer:

Answer 0 (score: 0):

Stopped all the daemons with stop-all.sh. Cleared all the files in the /tmp directory. Started all the daemons with start-all.sh.
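
In command form, the steps above look roughly like this (a minimal sketch: the paths assume the default Hadoop 1.2.1 layout and the /tmp/hadoop-priyank location that appears in the logs below, and the -format step is only needed because wiping /tmp also removes the namenode storage directory):

bin/stop-all.sh                   # stop all daemons
rm -rf /tmp/hadoop-priyank        # clear the Hadoop files under /tmp
bin/hadoop namenode -format       # recreate the dfs/name storage directory
bin/start-all.sh                  # start all daemons again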

I checked the logs before and after, and this is what I found.

Bad log:

2016-04-23 08:07:18,729 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemStateMBean and NameNodeMXBean
2016-04-23 08:07:18,800 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times 
2016-04-23 08:07:18,816 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory /tmp/hadoop-priyank/dfs/name does not exist.
2016-04-23 08:07:18,822 ERROR org.apache.hadoop.hdfs.server.namenode.FSNamesystem: FSNamesystem initialization failed.
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /tmp/hadoop-priyank/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:303)

Good log:

2016-04-23 09:28:30,205 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemStateMBean and NameNodeMXBean
2016-04-23 09:28:30,251 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occuring more than 10 times 
2016-04-23 09:28:30,332 INFO org.apache.hadoop.hdfs.server.common.Storage: Number of files = 1
2016-04-23 09:28:30,344 INFO org.apache.hadoop.hdfs.server.common.Storage: Number of files under construction = 0
2016-04-23 09:28:30,344 INFO org.apache.hadoop.hdfs.server.common.Storage: Image file of size 113 loaded in 0 seconds.
2016-04-23 09:28:30,345 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: EOF of /tmp/hadoop-priyank/dfs/name/current/edits, reached end of

Maybe the namenode had not been formatted with the script, and I had wiped the /tmp directory. It started!!
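
A likely reason the storage directory ended up under /tmp at all: in Hadoop 1.x, hadoop.tmp.dir defaults to /tmp/hadoop-${user.name} and dfs.name.dir defaults to ${hadoop.tmp.dir}/dfs/name, so anything that clears /tmp deletes the namenode metadata. A sketch of one way to avoid this, using the /app/hadoop/tmp path mentioned in the question, is to point hadoop.tmp.dir at a persistent directory in core-site.xml and format the namenode once afterwards:

<configuration>
     <property>
         <name>hadoop.tmp.dir</name>
         <value>/app/hadoop/tmp</value>
     </property>
     <property>
         <name>fs.default.name</name>
         <value>hdfs://localhost:9000</value>
     </property>
</configuration>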