When I try to start the nodes in Hadoop, I get the following error.
[hadoop@96 ~]$ start-dfs.sh
/usr/bin/env: bash: No such file or directory
Starting namenodes on []
/usr/bin/env: bash: No such file or directory
/usr/bin/env: bash: No such file or directory
/usr/bin/env: bash: No such file or directory
/home/hadoop/hadoop/hadoop-2.5.0-cdh5.3.1/sbin/start-dfs.sh: line 109: tr: command not found
[hadoop@96 ~]$
Please let me know if you have any ideas.

I'm adding the relevant part of the start-dfs.sh file for more information; the error is reported at the tr command.
#---------------------------------------------------------
# ZK Failover controllers, if auto-HA is enabled
AUTOHA_ENABLED=$($HADOOP_PREFIX/bin/hdfs getconf -confKey dfs.ha.automatic-failover.enabled)
if [ "$(echo "$AUTOHA_ENABLED" | tr A-Z a-z)" = "true" ]; then
  echo "Starting ZK Failover Controllers on NN hosts [$NAMENODES]"
  "$HADOOP_PREFIX/sbin/hadoop-daemons.sh" \
    --config "$HADOOP_CONF_DIR" \
    --hostnames "$NAMENODES" \
    --script "$bin/hdfs" start zkfc
fi
# eof
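For what it's worth, both messages (`env` unable to find `bash`, and `tr: command not found`) point to the same underlying problem: the `PATH` of the `hadoop` user appears to be missing the standard binary directories, since `bash` and `tr` normally live in `/bin` or `/usr/bin`. A minimal diagnostic sketch, assuming a typical Linux layout (the exact `PATH` value below is illustrative, not taken from the question):

```shell
#!/bin/sh
# Check whether the standard tools the script needs are resolvable.
echo "PATH is: $PATH"          # should include /usr/bin and /bin
command -v bash || echo "bash not found on PATH"
command -v tr   || echo "tr not found on PATH"

# If they are missing, restore a sane PATH for the current session
# (add this to ~/.bashrc or the hadoop-env.sh file to make it stick):
export PATH=/usr/local/bin:/usr/bin:/bin:$PATH
```

If `command -v tr` still prints nothing after fixing `PATH`, the coreutils package itself may be missing or broken on the node, which would be a separate problem from the Hadoop configuration.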