Application application_1561548360472_0002 failed 2 times due to AM Container for appattempt_1561548360472_0002_000002 exited with exitCode: 1

Time: 2019-06-27 05:57:57

Tags: hadoop yarn

I am running a simple word count program with Hadoop on a single node. However, when I try to run the program it fails with the error: 10:13:47 INFO mapreduce.Job: Job job_1561548360472_0002 failed with state FAILED due to: Application application_1561548360472_0002 failed 2 times due to AM Container for appattempt_1561548360472_0002_000002 exited with exitCode: 1. Failing this attempt. Diagnostics: Exception from container-launch. Container id: container_1561548360472_0002_02_000001. When I look at the logs of application_1561548360472_0002, they show me this:

    =====================================================================================
    LogType:stderr
    Log Upload Time:Thu Jun 27 10:13:48 +0530 2019
    LogLength:174
    Log Contents:
    Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
    Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/service/CompositeService
    End of LogType:stderr

    LogType:stdout
    Log Upload Time:Thu Jun 27 10:13:48 +0530 2019
    LogLength:0
    Log Contents:
    End of LogType:stdout

    Container: container_1561548360472_0002_02_000001 on sanjay-php_40683_1561610628456

    LogType:stderr
    Log Upload Time:Thu Jun 27 10:13:48 +0530 2019
    LogLength:174
    Log Contents:
    Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
    Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/service/CompositeService
    End of LogType:stderr

    LogType:stdout
    Log Upload Time:Thu Jun 27 10:13:48 +0530 2019
    LogLength:0
    Log Contents:
    End of LogType:stdout

I have tried everything, but nothing works. My yarn-site.xml is:

    <!--
      Licensed under the Apache License, Version 2.0 (the "License");
      you may not use this file except in compliance with the License.
      You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

      Unless required by applicable law or agreed to in writing, software
      distributed under the License is distributed on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
      See the License for the specific language governing permissions and
      limitations under the License. See accompanying LICENSE file.
    -->
    <configuration>
      <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
      </property>
      <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
      </property>
      <property>
        <name>yarn.application.classpath</name>
        <value>
          %HADOOP_HOME%\etc\hadoop,
          %HADOOP_HOME%\share\hadoop\common\*,
          %HADOOP_HOME%\share\hadoop\common\lib\*,
          %HADOOP_HOME%\share\hadoop\hdfs\*,
          %HADOOP_HOME%\share\hadoop\hdfs\lib\*,
          %HADOOP_HOME%\share\hadoop\mapreduce\*,
          %HADOOP_HOME%\share\hadoop\mapreduce\lib\*,
          %HADOOP_HOME%\share\hadoop\yarn\*,
          %HADOOP_HOME%\share\hadoop\yarn\lib\*
        </value>
      </property>
      <property>
        <name>yarn.nodemanager.remote-app-log-dir-suffix</name>
        <value>/test</value>
      </property>
      <property>
        <name>yarn.nodemanager.log-aggregation.roll-monitoring-interval-seconds</name>
        <value>3600</value>
      </property>
      <!-- Site specific YARN configuration properties -->
      <property>
        <name>yarn.application.classpath</name>
        <value>$HADOOP_CLIENT_CONF_DIR,$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_YARN_HOME/*,$HADOOP_YARN_HOME/lib/*,$HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,$MR2_CLASSPATH</value>
      </property>
      <property>
        <name>yarn.resourcemanager.address</name>
        <value>127.0.0.1:8032</value>
      </property>
      <property>
        <name>yarn.resourcemanager.scheduler.address</name>
        <value>127.0.0.1:8030</value>
      </property>
      <property>
        <name>yarn.resourcemanager.resource-tracker.address</name>
        <value>127.0.0.1:8031</value>
      </property>
    </configuration>
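
For reference, this yarn-site.xml defines yarn.application.classpath twice, and the first definition uses Windows-style %HADOOP_HOME% variables with backslashes, which a Linux NodeManager will not expand. As a hedged sketch, a single definition modelled on the stock yarn-default.xml value looks roughly like the following (on a single-node tarball install the *_HOME variables normally all resolve to $HADOOP_HOME; adjust the paths to the actual layout):

    <property>
        <name>yarn.application.classpath</name>
        <!-- sketch based on the yarn-default.xml default value;
             adjust paths to the actual install layout -->
        <value>
            $HADOOP_CONF_DIR,
            $HADOOP_COMMON_HOME/share/hadoop/common/*,
            $HADOOP_COMMON_HOME/share/hadoop/common/lib/*,
            $HADOOP_HDFS_HOME/share/hadoop/hdfs/*,
            $HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,
            $HADOOP_YARN_HOME/share/hadoop/yarn/*,
            $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*
        </value>
    </property>

The class the container cannot find, org.apache.hadoop.service.CompositeService, ships in the hadoop-common jar, so the share/hadoop/common entries are the ones the AM launch classpath has to resolve.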

My WC_Runner.java file is:

    import java.io.IOException;    
    import org.apache.hadoop.fs.Path;    
    import org.apache.hadoop.io.IntWritable;    
    import org.apache.hadoop.io.Text;    
    import org.apache.hadoop.mapred.FileInputFormat;    
    import org.apache.hadoop.mapred.FileOutputFormat;    
    import org.apache.hadoop.mapred.JobClient;    
    import org.apache.hadoop.mapred.JobConf;    
    import org.apache.hadoop.mapred.TextInputFormat;    
    import org.apache.hadoop.mapred.TextOutputFormat;    
    public class WC_Runner {    
        public static void main(String[] args) throws IOException{    
            JobConf conf = new JobConf(WC_Runner.class);    
            conf.setJobName("WordCount");    
            conf.setOutputKeyClass(Text.class);    
            conf.setOutputValueClass(IntWritable.class);            
            conf.setMapperClass(WC_Mapper.class);    
            conf.setCombinerClass(WC_Reducer.class);    
            conf.setReducerClass(WC_Reducer.class);         
            conf.setInputFormat(TextInputFormat.class);    
            conf.setOutputFormat(TextOutputFormat.class);           
            FileInputFormat.setInputPaths(conf,new Path(args[0]));    
            FileOutputFormat.setOutputPath(conf,new Path(args[1]));     
            JobClient.runJob(conf);    
        }    
    }    
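
The WC_Runner code itself is the classic mapred-API word count; the failure happens before any mapper runs, when YARN launches the MRAppMaster for the job. The MapReduce side of the classpath is normally configured in mapred-site.xml rather than yarn-site.xml. A minimal sketch of the entries typically involved is below; the property names are standard Hadoop keys, but the values assume a plain $HADOOP_HOME tarball install and may need adjusting:

    <configuration>
        <!-- run MapReduce jobs on YARN -->
        <property>
            <name>mapreduce.framework.name</name>
            <value>yarn</value>
        </property>
        <!-- jars the MR ApplicationMaster and tasks put on their classpath;
             assumes the standard $HADOOP_MAPRED_HOME/share layout -->
        <property>
            <name>mapreduce.application.classpath</name>
            <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
        </property>
    </configuration>

On Hadoop 3.x, HADOOP_MAPRED_HOME usually also has to be listed in yarn.nodemanager.env-whitelist so the variable is visible inside the launched container.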

Please help me get rid of this problem. Thanks.

0 Answers:

No answers yet.