Starting Hadoop fails in hadoop-functions.sh

Time: 2018-08-28 10:04:28

Tags: hadoop

I am trying to start Hadoop, but it fails and nothing gets started. The console log follows.

Mac:sbin lqs2$ sh start-all.sh
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-functions.sh: line 398: syntax error near unexpected token `<'
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-functions.sh: line 398: `done < <(for text in "${input[@]}"; do'
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-config.sh: line 70: hadoop_deprecate_envvar: command not found
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-config.sh: line 87: hadoop_bootstrap: command not found
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-config.sh: line 104: hadoop_parse_args: command not found
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-config.sh: line 105: shift: : numeric argument required
WARNING: Attempting to start all Apache Hadoop daemons as lqs2 in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.

I have tried everything I could find to fix it, but nothing worked. I even reinstalled the latest version, and the error stays the same. It is driving me crazy.

Any answer would be helpful. Thanks.

1 answer:

Answer 0 (score: 1)

The Hadoop scripts require bash, not sh. The done < <(...) construct at line 398 of hadoop-functions.sh is bash process substitution, which plain sh cannot parse; because that file fails to load, the later "command not found" errors for hadoop_deprecate_envvar, hadoop_bootstrap, and hadoop_parse_args follow.

$ chmod +x start-all.sh
$ ./start-all.sh
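
Alternatively (not part of the original answer), invoking the script through bash explicitly should also sidestep the sh incompatibility, since the interpreter is then chosen on the command line rather than defaulting to sh:

$ bash start-all.sh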

That said, I would recommend starting HDFS and YARN separately so you can isolate any other problems; see the sketch below.
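
A minimal sketch of that, assuming the standard Hadoop 3.x sbin scripts and that the daemons are already configured; jps (shipped with the JDK) simply lists the running Java processes so you can see which daemons actually came up:

$ ./start-dfs.sh
$ ./start-yarn.sh
$ jps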

You would also need to downgrade Hadoop to at least the latest 2.7 release for Spark to work correctly.
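
As a quick sanity check after switching versions, hadoop version prints which release is actually being picked up (assuming the corresponding bin directory comes first on your PATH):

$ hadoop version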