E0405: Submission request doesn't have any application or lib path

Asked: 2019-09-27 07:55:03

Tags: hadoop hdfs oozie oozie-workflow

This is my first time running a MapReduce program with Oozie.

Here is my job.properties file:

nameNode=file:/usr/local/hadoop_store/hdfs/namenode
jobTracker=localhost:8088
queueName=default
oozie.wf.applications.path=${nameNode}/Config

Here is my hdfs-site.xml:

<configuration>
 <property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication.
  The actual number of replications can be specified when the file is created.
  The default is used if replication is not specified in create time.
  </description>
 </property>
 <property>
   <name>dfs.namenode.name.dir</name>
   <value>file:/usr/local/hadoop_store/hdfs/namenode</value>
 </property>
 <property>
   <name>dfs.datanode.data.dir</name>
   <value>file:/usr/local/hadoop_store/hdfs/datanode</value>
 </property>
</configuration>

Here is my core-site.xml:

<configuration>
 <property>
  <name>hadoop.tmp.dir</name>
  <value>/app/hadoop/tmp</value>
  <description>A base for other temporary directories.</description>
 </property>
 <property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
 </property>
 <property>
  <name>hadoop.proxyuser.hduser.hosts</name>
  <value>*</value>
 </property>
 <property>
  <name>hadoop.proxyuser.hduser.groups</name>
  <value>*</value>
 </property>
</configuration>

However, when I run the Oozie command below to launch my MapReduce program, it complains that it cannot find the lib folder: Error: E0405 : E0405: Submission request doesn't have any application or lib path

oozie job -oozie http://localhost:11000/oozie -config job.properties -run

I have already created the Config folder in HDFS and also created a lib folder inside it. My MapReduce jar file is in the lib folder and my workflow.xml is in the Config folder (all in HDFS).
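For reference, this is a minimal sketch of the shell commands that would create and verify that layout, assuming the application directory is /Config at the HDFS root as described above (the jar file name is a placeholder):

hdfs dfs -mkdir -p /Config/lib
hdfs dfs -put workflow.xml /Config/
hdfs dfs -put datadivider.jar /Config/lib/    # placeholder jar name
hdfs dfs -ls -R /Config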

I think I have entered the wrong HDFS path for nameNode in job.properties, and that is why it cannot find {nameNode}/Config. What should that HDFS path be?

Thanks

Update 1 - job.properties

nameNode=hdfs://localhost:8020
jobTracker=localhost:8088
queueName=default
oozie.wf.applications.path=${nameNode}/Config

Still getting the same error:

Error: E0405 : E0405: Submission request doesn't have any application or lib path

Update 2 - workflow.xml in the Config folder in HDFS

<workflow-app xmlns="uri:oozie:workflow:0.4" name="simple-Workflow">
   <start to="RunMapreduceJob" />
   <action name="RunMapreduceJob">
      <map-reduce>
         <job-tracker>localhost:8088</job-tracker>
         <name-node>file:/usr/local/hadoop_store/hdfs/namenode</name-node>
         <prepare>
            <delete path="file:/usr/local/hadoop_store/hdfs/namenode"/>
         </prepare>
         <configuration>
            <property>
               <name>mapred.job.queue.name</name>
               <value>default</value>
            </property>
            <property>
               <name>mapred.mapper.class</name>
               <value>DataDividerByUser.DataDividerMapper</value>
            </property>
            <property>
               <name>mapred.reducer.class</name>
               <value>DataDividerByUser.DataDividerReducer</value>
            </property>
            <property>
               <name>mapred.output.key.class</name>
               <value>org.apache.hadoop.io.IntWritable</value>
            </property>
            <property>
               <name>mapred.output.value.class</name>
               <value>org.apache.hadoop.io.Text</value>
            </property>
            <property>
               <name>mapred.input.dir</name>
               <value>/data</value>
            </property>
            <property>
               <name>mapred.output.dir</name>
               <value>/dataoutput</value>
            </property>
         </configuration>
      </map-reduce>
      <ok to="end" />
      <error to="fail" />
   </action>
   <kill name="fail">
      <message>Mapreduce program Failed</message>
   </kill>
   <end name="end" />
</workflow-app>

1 Answer:

Answer 0 (score: 1)

The <name-node> tag should not be a file path. It should point to the NameNode of the underlying Hadoop cluster where Oozie has to run the MapReduce job. Your name node should be the value of fs.default.name from your core-site.xml:

nameNode=hdfs://localhost:9000
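The same hdfs:// address should also replace the file: path in the <name-node> element of workflow.xml, roughly like this (a sketch showing only the changed element; the rest of the action stays as in the question):

<map-reduce>
   <job-tracker>localhost:8088</job-tracker>
   <name-node>hdfs://localhost:9000</name-node>
   ...
</map-reduce>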

Also, change the property name oozie.wf.applications.path to oozie.wf.application.path (without the "s").

Add the property oozie.use.system.libpath=true to your properties file.
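Putting these changes together, the job.properties would look roughly like this (a sketch based on the values from the question; the jobTracker entry is kept as given there):

nameNode=hdfs://localhost:9000
jobTracker=localhost:8088
queueName=default
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/Config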

Source: Apache Oozie by Mohammad Kamrul Islam and Aravind Srinivasan