Hadoop on Mac OS X: 'HadoopIllegalArgumentException: An XAttr name must be prefixed' error

Posted: 2015-01-29 15:59:30

Tags: java hadoop

Today I tried to install Hadoop on my Mac OS X Lion following the instructions in Setting up Hadoop 2.4 and Pig 0.12 on OSX locally.

I have correctly set

JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_71.jdk/Contents/Home

in ~/.bash_profile and ~/.bashrc, and I successfully installed the latest version of Hadoop (2.6.0) with brew, editing these four configuration files accordingly: hdfs-site.xml, core-site.xml, mapred-site.xml, yarn-site.xml.
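For reference, a minimal single-node setup of the first two of those files typically looks like the sketch below. The property names are standard Hadoop 2.x keys, but the port and values are illustrative assumptions, not the asker's actual configuration:

```xml
<!-- core-site.xml: default filesystem URI (localhost:9000 is a common choice, assumed here) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

```xml
<!-- hdfs-site.xml: replication factor 1 for a single-node install -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```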

But running:

./bin/hdfs namenode -format

gives:

15/01/29 17:42:01 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = Venuses-Mac-mini.local/192.168.1.51
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.6.0
STARTUP_MSG:   classpath = /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop:/usr/local/Cellar/hadoop/2.6.0/libexec/share/hadoop/common/lib/activation-1.1.jar <TRUNCATED - Big Chunk of Code Containing .jar Filenames> 
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
STARTUP_MSG:   java = 1.6.0_29
<TRUNCATED - Big Chunk of .jar Filenames>
************************************************************/
15/01/29 17:42:01 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
15/01/29 17:42:01 INFO namenode.NameNode: createNameNode [-format]
2015-01-29 17:42:02.551 java[1016:1903] Unable to load realm info from SCDynamicStore
15/01/29 17:42:02 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-aaa7a5a6-3e82-4166-8039-16046f1b4761
<TRUNCATED>
15/01/29 17:42:03 ERROR namenode.FSNamesystem: FSNamesystem initialization failed.
org.apache.hadoop.HadoopIllegalArgumentException: An XAttr name must be prefixed with user/trusted/security/system/raw, followed by a '.'
at org.apache.hadoop.hdfs.XAttrHelper.buildXAttr(XAttrHelper.java:72)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.<init>(FSDirectory.java:137)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:894)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:755)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:934)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1379)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1504)
15/01/29 17:42:03 INFO namenode.FSNamesystem: Stopping services started for active state
15/01/29 17:42:03 INFO namenode.FSNamesystem: Stopping services started for standby state
15/01/29 17:42:03 FATAL namenode.NameNode: Failed to start namenode.
<TRUNCATED>
15/01/29 17:42:03 INFO util.ExitUtil: Exiting with status 1
15/01/29 17:42:03 INFO namenode.NameNode: SHUTDOWN_MSG: 

I have two Java versions installed on my Mac. Hadoop is picking up the old version, 1.6.0_29, instead of the current 1.7.0_72. I don't know whether Hadoop takes the current Java version into account.

Note: I searched extensively on Google and could not find a solution for this specific error.

Thanks.

1 Answer:

Answer 0 (score: 0)

What do you get when you run the following in a terminal shell?

/usr/libexec/java_home

If that returns your 1.6 JDK, then somewhere Hadoop is probably using that command to decide which Java to use. For example, one place this can happen is the file '/usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hadoop-env.sh'. Line 25 of that file is:

export JAVA_HOME="$(/usr/libexec/java_home)"

Try changing it to:

export JAVA_HOME="$(/usr/libexec/java_home -v1.7)"

in order to set JAVA_HOME to the 1.7 JDK. Or it may be that some other Hadoop file is doing something similar to find Java.
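If hadoop-env.sh turns out to be the culprit, the edit can be scripted as below. This is a sketch: the file path is the one mentioned above (adjust for your brew prefix and Hadoop version), and the `sed -i ''` form is BSD/macOS sed syntax.

```shell
# Path from the answer above; adjust to your brew prefix / Hadoop version.
HADOOP_ENV=/usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hadoop-env.sh

# Keep a backup before editing.
cp "$HADOOP_ENV" "$HADOOP_ENV.bak"

# Rewrite the JAVA_HOME line to pin the 1.7 JDK (BSD sed in-place syntax).
sed -i '' 's|export JAVA_HOME=.*|export JAVA_HOME="$(/usr/libexec/java_home -v1.7)"|' "$HADOOP_ENV"

# Confirm the change took effect.
grep '^export JAVA_HOME' "$HADOOP_ENV"
```

After the change, re-run `./bin/hdfs namenode -format` and check that the `java = ...` line in the startup banner now reports 1.7 instead of 1.6.0_29.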