I installed Hadoop on Ubuntu and it is running fine.
ubuntu:/home/hduser/hive-0.10.0-cdh4.3.1$ jps
2702 DataNode
3101 ResourceManager
4879 Jps
2948 SecondaryNameNode
3306 NodeManager
hadoop_version = Hadoop 2.0.0-cdh4.3.0
Then I installed Hive from the Apache tarball (Hive version hive-0.10.0) and tried to run bin/hive, but I got the following error:
Cannot determine Hadoop version information.
'hadoop version' returned:
/home/hduser/hadoop/etc/hadoop
/usr/lib/jvm/jdk1.6.0_45/
Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar
I tried to fix it with what scripting knowledge I have, but couldn't. Digging in, I found it fails on the line below:
if [[ "$HADOOP_VERSION" =~ $hadoop_version_re ]]; then
I tried echoing HADOOP_VERSION and it returned nothing. HADOOP_VERSION is defined as:
HADOOP_VERSION=$($HADOOP version | awk '{if (NR == 1) {print $2;}}');
and $HADOOP version gives me:
/home/hduser/hadoop/etc/hadoop
/usr/lib/jvm/jdk1.6.0_45/
Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar
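To see why the assignment comes back empty, here is a minimal sketch (using the same awk expression as Hive's script) that compares clean output with the polluted output above — with the extra lines, line 1 is a bare path, so field 2 does not exist:

```shell
#!/bin/sh
# Hive takes field 2 of the FIRST line of `hadoop version`.
# With clean output that field is the version string; with the
# polluted output above, line 1 has only one field, so $2 is empty.
clean='Hadoop 2.0.0-cdh4.3.0'
polluted='/home/hduser/hadoop/etc/hadoop
Hadoop 2.0.0-cdh4.3.0'
printf '%s\n' "$clean"    | awk '{if (NR == 1) {print $2;}}'   # -> 2.0.0-cdh4.3.0
printf '%s\n' "$polluted" | awk '{if (NR == 1) {print $2;}}'   # -> (empty line)
```

That empty result then fails the `=~ $hadoop_version_re` check, producing the "Cannot determine Hadoop version information" error.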
I have been stuck on this for a week now. Please help me. Thanks.
Answer 0 (score: 1)
Your question already describes the problem. When the script executes $HADOOP version, it expects output like this:
Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar
Instead, some extra output comes first (probably because you modified one of the scripts in Hadoop; check conf/hadoop-env.sh):
/home/hduser/hadoop/etc/hadoop
/usr/lib/jvm/jdk1.6.0_45/
Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar
Now the awk line no longer finds the version number where it expects it (field 2 of the first line). So the solution is to track down where the extra output comes from and remove it.
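One way to hunt for that stray output is to grep the Hadoop config scripts for echo statements; judging by the two extra lines (a conf directory and a JDK path), someone may have added debug echoes to hadoop-env.sh, though that is a guess. A sketch, demonstrated on a throwaway file (the file contents here are invented for illustration):

```shell
#!/bin/sh
# Anything echoed from hadoop-env.sh runs on every `hadoop` invocation,
# so it lands in front of the `hadoop version` output that Hive parses.
tmp=$(mktemp -d)
cat > "$tmp/hadoop-env.sh" <<'EOF'
export JAVA_HOME=/usr/lib/jvm/jdk1.6.0_45/
echo $HADOOP_CONF_DIR   # stray debug line: remove lines like this
echo $JAVA_HOME         # stray debug line: remove lines like this
EOF
# Locate the offending statements:
grep -n 'echo' "$tmp/hadoop-env.sh"
rm -rf "$tmp"
```

In a real setup you would run the grep against $HADOOP_HOME/etc/hadoop/hadoop-env.sh (and any other sourced script) and delete the echo lines it finds.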
Answer 1 (score: 1)
I had the same problem, and I fixed it by adding the following to my .profile and sourcing it again:
export HADOOP_VERSION="2.0.0-cdh4.2.0"
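A sketch of that workaround, shown on a throwaway file so it is runnable as-is (in practice you would append to ~/.profile, and the version string must match your own install):

```shell
#!/bin/sh
# Pin HADOOP_VERSION in a profile file and source it, so Hive's
# version check no longer depends on parsing `hadoop version` output.
profile=$(mktemp)
echo 'export HADOOP_VERSION="2.0.0-cdh4.2.0"' >> "$profile"
. "$profile"
echo "$HADOOP_VERSION"   # -> 2.0.0-cdh4.2.0
rm -f "$profile"
```

Note this masks the underlying problem (the stray output) rather than removing it.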
Answer 2 (score: 0)
On Windows you may hit the same problem. If $HADOOP_HOME is set to a DOS-style path (e.g. C:\hadoop), you need to convert it in Cygwin. One way is to put the following line in .bashrc:
export HADOOP_HOME="$(cygpath $HADOOP_HOME)"
Answer 3 (score: 0)
hduser@ubuntu:/usr/local/hadoop/sbin$ hadoop version
Answer 4 (score: 0)
Check your Java path (JRE_HOME).
Answer 5 (score: 0)
If you have export HADOOP_VERSION=2.0.0-cdh4.3.0 (or your version number) set in your .bashrc file, comment it out by putting a # in front, like #export HADOOP_VERSION=2.0.0-cdh4.3.0, then run Hive and you should be able to resolve the problem.