I created a 4-node cluster following this guide: https://blog.insightdatascience.com/spinning-up-a-free-hadoop-cluster-step-by-step-c406d56bae42, but as soon as I get to the part where the Hadoop cluster is started, I run into errors:
$HADOOP_HOME/sbin/start-dfs.sh
Here is what happens when I run jps:
Starting namenodes on [namenode_dns]
namenode_dns: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
namenode_dns: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
namenode_dns: starting namenode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out
namenode_dns: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No such file or directory
namenode_dns: head: cannot open '/usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out' for reading: No such file or directory
namenode_dns: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No such file or directory
namenode_dns: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No such file or directory
ip-172-31-1-82: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-1-82.out
ip-172-31-7-221: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-7-221.out
ip-172-31-14-230: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-14-230.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
0.0.0.0: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out: No such file or directory
0.0.0.0: head: cannot open '/usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out' for reading: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out: No such file or directory
I'm not sure where my configuration went wrong. I'm new to Hadoop and MapReduce, so please keep it simple.
Answer 0 (score: 2)
This is a permissions issue. It looks like the user you start the Hadoop services with (I assume ubuntu) has no write access to the log directory under /usr/local/hadoop; you probably copied the Hadoop files as sudo/root. Try recursively changing the ownership of the Hadoop home directory, or grant write access to the /usr/local/hadoop/logs directory:
sudo chown -R ubuntu:ubuntu /usr/local/hadoop
or
sudo chmod 777 /usr/local/hadoop/logs
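Combining the two ideas into one sketch (paths and the ubuntu user name are assumptions taken from the tutorial; adjust them to your cluster): creating the missing logs directory up front and then fixing ownership avoids both the failed mkdir and the failed chown seen in the log. Run this on every node:

```shell
# Assumes the tutorial's layout: Hadoop unpacked under /usr/local/hadoop
# and the daemons started as the 'ubuntu' user. Adjust if yours differ.
sudo mkdir -p /usr/local/hadoop/logs            # create the missing logs dir
sudo chown -R ubuntu:ubuntu /usr/local/hadoop   # hand the whole tree to the daemon user
ls -ld /usr/local/hadoop/logs                   # verify: owner should now be 'ubuntu'
```

chmod 777 also works, but it grants write access to every user on the machine; fixing ownership with chown is the tighter option.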