I am trying to install Hadoop 3.0 on a freshly reformatted SUSE Linux server. As far as I can tell I followed the installation steps correctly, but none of the hadoop commands work.
For example, I tried running hadoop conftest and got the error:
Error: Could not find or load main class org.apache.hadoop.util.ConfTest
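For context, the install lives under /usr/local/hadoop and Java is the system JRE. The values below match what the trace further down reports; the export lines themselves are only a sketch of how I set things in my profile (the PATH line in particular is my assumption, not something taken from the trace):

export HADOOP_HOME=/usr/local/hadoop      # matches hadoop.home.dir in the trace below
export JAVA_HOME=/usr/lib64/jvm/jre       # matches the java binary the launcher execs
export PATH="$PATH:$HADOOP_HOME/bin"      # assumption: this is how bin/hadoop ends up on my PATH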
To debug this, I modified the launcher script itself: I added this line near the top of bin/hadoop:
set -xv
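So the top of bin/hadoop now looks roughly like this (Apache license header trimmed; set -xv is the only line I added):

#!/usr/bin/env bash
# ... Apache license header ...
set -xv    # added by me: make bash echo every command as the launcher runs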
Then I ran hadoop conftest again; here is the tail end of the output:
++ HADOOP_OPTS='-Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=ec2-user -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Dhadoop.security.logger=INFO,NullAppender'
+ [[ - = \ ]]
+ hadoop_debug 'HADOOP_OPTS accepted -Dhadoop.security.logger=INFO,NullAppender'
+ [[ -n '' ]]
+ hadoop_translate_cygwin_path HADOOP_HOME
+ [[ false = \t\r\u\e ]]
+ hadoop_translate_cygwin_path HADOOP_CONF_DIR
+ [[ false = \t\r\u\e ]]
+ hadoop_translate_cygwin_path HADOOP_COMMON_HOME
+ [[ false = \t\r\u\e ]]
+ hadoop_translate_cygwin_path HADOOP_HDFS_HOME
+ [[ false = \t\r\u\e ]]
+ hadoop_translate_cygwin_path HADOOP_YARN_HOME
+ [[ false = \t\r\u\e ]]
+ hadoop_translate_cygwin_path HADOOP_MAPRED_HOME
+ [[ false = \t\r\u\e ]]
+ [[ false = true ]]
+ hadoop_java_exec conftest org.apache.hadoop.util.ConfTest
+ local command=conftest
+ local class=org.apache.hadoop.util.ConfTest
+ shift 2
+ hadoop_debug 'Final CLASSPATH: /usr/local/hadoop/conf'
+ [[ -n '' ]]
+ hadoop_debug 'Final HADOOP_OPTS: -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=ec2-user -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Dhadoop.security.logger=INFO,NullAppender'
+ [[ -n '' ]]
+ hadoop_debug 'Final JAVA_HOME: /usr/lib64/jvm/jre'
+ [[ -n '' ]]
+ hadoop_debug 'java: /usr/lib64/jvm/jre/bin/java'
+ [[ -n '' ]]
+ hadoop_debug 'Class name: org.apache.hadoop.util.ConfTest'
+ [[ -n '' ]]
+ hadoop_debug 'Command line options: '
+ [[ -n '' ]]
+ export CLASSPATH
+ exec /usr/lib64/jvm/jre/bin/java -Dproc_conftest -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=ec2-user -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.ConfTest
What is wrong with my installation?
EDIT:
I ran this command:
hdfs classpath
and got this result:
/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/*:/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/*:/usr/local/hadoop/share/hadoop/hdfs/*:/usr/local/hadoop/share/hadoop/yarn/lib/*:/usr/local/hadoop/share/hadoop/yarn/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar
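What strikes me is that the trace above reports Final CLASSPATH: /usr/local/hadoop/conf, while hdfs classpath reports the etc/hadoop config directory plus all of the share/hadoop jar directories, so I suspect something in my environment is overriding the classpath the hadoop launcher uses. These are the checks I can run if more detail would help (standard Hadoop 3 commands and variable names, nothing custom to my setup):

hadoop classpath                          # what the hadoop launcher itself computes
env | grep -i -E '^(CLASSPATH|HADOOP_)'   # any overrides exported from my shell profile
echo "${HADOOP_CONF_DIR:-<not set>}"      # where the launcher should be taking its config dir from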