I'm trying to learn Hadoop. I'm following the "Hadoop Building Blocks" course on Pluralsight, and I'm trying to run Hadoop in pseudo-distributed mode. When I run the following command:

bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.0.3.jar grep input output 'dfs[a-z.]+'

I get the following output:
Application application_1530031734419_0001 failed 2 times due to AM Container for appattempt_1530031734419_0001_000002 exited with exitCode: 1
Failing this attempt.Diagnostics: [2018-06-26 16:50:21.067]Exception from container-launch.
Container id: container_1530031734419_0001_02_000001
Exit code: 1
[2018-06-26 16:50:21.076]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
Please check whether your etc/hadoop/mapred-site.xml contains the below configuration:
<property>
<name>yarn.app.mapreduce.am.env</name>
<value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
</property>
<property>
<name>mapreduce.map.env</name>
<value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
</property>
<property>
<name>mapreduce.reduce.env</name>
<value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
</property>
[2018-06-26 16:50:21.077]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
Please check whether your etc/hadoop/mapred-site.xml contains the below configuration:
<property>
<name>yarn.app.mapreduce.am.env</name>
<value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
</property>
<property>
<name>mapreduce.map.env</name>
<value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
</property>
<property>
<name>mapreduce.reduce.env</name>
<value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
</property>
For more detailed output, check the application tracking page: http://homestead:8088/cluster/app/application_1530031734419_0001 Then click on links to logs of each attempt.
. Failing the application.
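(In case it's useful: I believe the full container logs can also be pulled from the command line, assuming YARN log aggregation is enabled, using the application id shown in the output above:)

yarn logs -applicationId application_1530031734419_0001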
I tried to follow what the log says and searched for similar questions. Basically the error seems to come from some of the configuration files, so here they are:
mapred-site.xml
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
<property>
<name>yarn.app.mapreduce.am.env</name>
<value>HADOOP_MAPRED_HOME=~/hadoop-install/hadoop-3.0.3</value>
</property>
<property>
<name>mapreduce.map.env</name>
<value>HADOOP_MAPRED_HOME=~/hadoop-install/hadoop-3.0.3</value>
</property>
<property>
<name>mapreduce.reduce.env</name>
<value>HADOOP_MAPRED_HOME=~/hadoop-install/hadoop-3.0.3</value>
</property>
<property>
<name>mapreduce.application.classpath</name>
<value>$HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,$MR2_CLASS</value>
</property>
</configuration>
hadoop-env.sh // since this file is large, I'm only including the changes I made to the default file:
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-amd64
export HADOOP_HOME=~/hadoop-install/hadoop-3.0.3
export YARN_HOME=$HADOOP_HOME
Plus: all the processes started by sbin/start-all.sh are running. The OS is Ubuntu, running in a virtual machine on a Windows 10 host. The Hadoop version is 3.0.3.
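For reference, this is roughly what I used to confirm the daemons were up (a jps listing for a pseudo-distributed setup; the PIDs are just examples and will differ on your machine):

$ jps
11201 NameNode
11342 DataNode
11589 SecondaryNameNode
11803 ResourceManager
11957 NodeManager
12210 Jps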
Answer 0 (score: 0)
It says:
${full path of your hadoop distribution directory}
1) You need to make sure the path is the same on every machine in your Hadoop cluster.
2) "Full path" means there is no ~/ in the path (see the sketch below).
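A minimal sketch of how those three properties could look with an absolute path instead of ~/ (the /home/youruser prefix is a placeholder for whatever ~/hadoop-install/hadoop-3.0.3 actually expands to on your machine):

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <!-- /home/youruser is a placeholder: replace it with your real home directory -->
  <property>
    <name>yarn.app.mapreduce.am.env</name>
    <value>HADOOP_MAPRED_HOME=/home/youruser/hadoop-install/hadoop-3.0.3</value>
  </property>
  <property>
    <name>mapreduce.map.env</name>
    <value>HADOOP_MAPRED_HOME=/home/youruser/hadoop-install/hadoop-3.0.3</value>
  </property>
  <property>
    <name>mapreduce.reduce.env</name>
    <value>HADOOP_MAPRED_HOME=/home/youruser/hadoop-install/hadoop-3.0.3</value>
  </property>
</configuration>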