I get this error when I try to run the command $ bin/hadoop namenode -format :
/home/MAHI/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `"'
/home/MAHI/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file
# The java implementation to use. Required.
export JAVA_HOME= "C:\Java\"
# Extra Java CLASSPATH elements. Optional.
# export HADOOP_CLASSPATH=
# The maximum amount of heap to use, in MB. Default is 1000.
# export HADOOP_HEAPSIZE=2000
# Extra Java runtime options. Empty by default.
# export HADOOP_OPTS=-server
# Command specific options appended to HADOOP_OPTS when specified
export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_DATANODE_OPTS"
export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_BALANCER_OPTS"
export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_JOBTRACKER_OPTS"
export HADOOP_TASKTRACKER_OPTS=
# The following applies to multiple commands (fs, dfs, fsck, distcp etc)
export HADOOP_CLIENT_OPTS
# Extra ssh options. Empty by default.
export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"
# Where log files are stored. $HADOOP_HOME/logs by default.
export HADOOP_LOG_DIR=${HADOOP_HOME}/logs
# File naming remote slave hosts. $HADOOP_HOME/conf/slaves by default.
export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves
# host:path where hadoop code should be rsync'd from. Unset by default.
export HADOOP_MASTER=master:/home/$USER/src/hadoop
# Seconds to sleep between slave commands. Unset by default. This
# can be useful in large clusters, where, e.g., slave rsyncs can
# otherwise arrive faster than the master can service them.
export HADOOP_SLAVE_SLEEP=0.1
# The directory where pid files are stored. /tmp by default.
# NOTE: this should be set to a directory that can only be written to by
# the user that will run the hadoop daemons. Otherwise there is the
# potential for a symlink attack.
export HADOOP_PID_DIR=/var/hadoop/pids
# A string representing this instance of hadoop. $USER by default.
export HADOOP_IDENT_STRING=$USER
# The scheduling priority for daemon processes. See 'man nice'.
export HADOOP_NICENESS=10
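The two errors quoted above can be reproduced from the JAVA_HOME line alone. Inside double quotes, the trailing backslash in "C:\Java\" escapes the closing quote, so bash reads to the end of the file looking for a matching `"` and reports an unexpected EOF. A minimal sketch (the temp-file path is arbitrary):

```shell
# Write just the problematic line to a scratch file; inside double quotes
# the final \" is an escaped quote, so the string never closes.
printf 'export JAVA_HOME= "C:\\Java\\"\n' > /tmp/hadoop-env-bad.sh

# bash -n syntax-checks without executing; it reports the same
# "unexpected EOF while looking for matching" error as in the question.
bash -n /tmp/hadoop-env-bad.sh 2>&1 | grep -c 'unexpected EOF'
```

Doubling the backslashes, or quoting the path with single quotes (where a backslash is literal), would silence the parser error, though as the answers below point out, a Windows-style path is likely wrong here in the first place.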
Answer 0 (score: 0)
export JAVA_HOME= "C:\Java\"
This suggests you are running Hadoop in a Windows environment. But:
export HADOOP_PID_DIR=/var/hadoop/pids
This line shows that you are supplying a folder location from a Linux environment. Don't mix Windows and Linux configuration; correct the paths so they reflect your operating system.
Answer 1 (score: 0)
You have set JAVA_HOME incorrectly in hadoop-env.sh. Give the absolute path of your Java installation. You can find out the current java path with the following command:
alternatives --config java
It lists all the Java versions you have installed; select the correct one and set that path in hadoop-env.sh like this:
export JAVA_HOME=/usr/java/jdk1.*
(Note that JAVA_HOME should point at the JDK root directory, not its bin subdirectory.)
Another approach is to set $JAVA_HOME in the user's .bashrc; then there is no need to set it in hadoop-env.sh.
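As a sanity check after editing, the file can be syntax-checked the same way bash would parse it on startup. The JDK path below is hypothetical; substitute the directory reported by alternatives --config java, without any trailing slash or backslash:

```shell
# Hypothetical corrected hadoop-env.sh fragment: JAVA_HOME points at the
# JDK root, the closing quote is no longer escaped away.
cat > /tmp/hadoop-env-fixed.sh <<'EOF'
export JAVA_HOME=/usr/java/jdk1.7.0_45
EOF

# bash -n parses without executing; silence means the syntax error is gone.
bash -n /tmp/hadoop-env-fixed.sh && echo "hadoop-env.sh parses cleanly"
```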
Answer 2 (score: 0)
Install Java on the machine and set the JAVA_HOME path in .bash_profile. After doing this, export the $JAVA_HOME variable so that other processes looking for JAVA_HOME can pick up this path. Also set the JAVA_HOME path in the /etc/hadoop/conf/hadoop-env.sh file. Note: some people have multiple Hadoop versions installed. Make sure you make the change in the correct Hadoop installation; to see which one is currently in use, list the symlinks under /etc/alternatives and make the necessary changes.
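Since the entries under /etc/alternatives are symlinks ending at the real binary, one way (a sketch, not part of the original answer) to derive JAVA_HOME is to strip the bin/java suffix from the resolved path. The resolved path below is hypothetical:

```shell
# On a live system you would resolve the symlink chain first, e.g.:
#   JAVA_BIN=$(readlink -f /etc/alternatives/java)
# A hypothetical resolved path stands in for the real one here.
JAVA_BIN=/usr/lib/jvm/java-1.7.0-openjdk/bin/java

# Dropping two path components turns .../bin/java into the JDK root,
# which is what hadoop-env.sh expects in JAVA_HOME.
JAVA_HOME=$(dirname "$(dirname "$JAVA_BIN")")
echo "$JAVA_HOME"   # /usr/lib/jvm/java-1.7.0-openjdk
```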