Unable to run a shell script with Oozie

Time: 2015-09-02 06:14:41

Tags: shell hadoop oozie oozie-coordinator sparkr

Hello, I am trying to run a shell script through Oozie. When the shell script runs, I get the following error:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1]
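That one-line exit code rarely tells the whole story. Assuming a standard CDH/Oozie setup like the one described below, the launcher's full stdout/stderr can be pulled from the Oozie or YARN command line (the server URL and the job/application IDs here are placeholders — substitute your own):

```shell
# Hypothetical IDs; use the workflow ID printed at submission time.
oozie job -oozie http://localhost:11000/oozie -log 0000035-150722003725443-oozie-oozi-W

# Or fetch the launcher container's logs directly from YARN:
yarn logs -applicationId application_150722003725443_0035
```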

My job.properties file:

nameNode=hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020
jobTracker=ip-172-31-41-199.us-west-2.compute.internal:8032
queueName=default
oozie.libpath=${nameNode}/user/oozie/share/lib/
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true
oozieProjectRoot=shell_example
oozie.wf.application.path=${nameNode}/user/karun/${oozieProjectRoot}/apps/shell
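Before submitting, it can save a round-trip to confirm that the application path from these properties actually contains the workflow. The paths below are taken from the properties above; the Oozie server URL is an assumption, so this is a sketch for that cluster layout:

```shell
# oozie.wf.application.path must contain workflow.xml:
hdfs dfs -ls /user/karun/shell_example/apps/shell

# If workflow.xml is there, submit and start the job:
oozie job -oozie http://localhost:11000/oozie -config job.properties -run
```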

My workflow.xml:

<workflow-app xmlns="uri:oozie:workflow:0.1" name="pi.R example">
    <start to="shell-node"/>
    <action name="shell-node">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <exec>script.sh</exec>
            <file>/user/karun/oozie-oozi/script.sh#script.sh</file>
            <capture-output/>
        </shell>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Incorrect output</message>
    </kill>
    <end name="end"/>
</workflow-app>

My shell script, script.sh:

export SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/spark
export YARN_CONF_DIR=/etc/hadoop/conf
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
export HADOOP_CMD=/usr/bin/hadoop
/SparkR-pkg/lib/SparkR/sparkR-submit --master yarn-client examples/pi.R yarn-client 4 

Error log file:

WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/etc/hive-webhcat/conf.dist/webhcat-default.xml:
CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/hadoop-kms:
LANG=en_US.UTF-8:
HADOOP_MAPRED_HOME=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/hadoop-mapreduce:

=================================================================

>>> Invoking Shell command line now >>
Stdoutput Running /opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/spark/bin/spark-submit --class edu.berkeley.cs.amplab.sparkr.SparkRRunner --files hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020/user/karun/examples/pi.R --master yarn-client /SparkR-pkg/lib/SparkR/sparkr-assembly-0.1.jar hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020/user/karun/examples/pi.R yarn-client 4
Stdoutput Fatal error: cannot open file 'pi.R': No such file or directory
Exit code of the Shell command 2
<<< Invocation of Shell command completed <<<
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1]

 Oozie Launcher failed, finishing Hadoop job gracefully

 Oozie Launcher, uploading action data to HDFS sequence file: hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020/user/karun/oozie-oozi/0000035-150722003725443-oozie-oozi-W/shell-node--shell/action-data.seq

 Oozie Launcher ends

I don't know how to resolve this issue. Any help would be greatly appreciated.

1 Answer:

Answer 0 (score: 1):

    sparkR-submit  ...  examples/pi.R  ...

Fatal error: cannot open file 'pi.R': No such file or directory

The message is quite explicit: your shell is trying to read the R script from the local filesystem. But what is "local" here, actually?
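A purely local shell experiment (no Hadoop required) makes this failure mode concrete: a relative path like `examples/pi.R` resolves against the process's current working directory, and a YARN container starts in a fresh scratch directory on an arbitrary node. The directories below are temporary ones created just for the demo:

```shell
set -u
workdir=$(mktemp -d)
mkdir -p "$workdir/examples"
echo 'print("pi")' > "$workdir/examples/pi.R"

cd "$workdir"
ls examples/pi.R        # found: the CWD contains examples/pi.R

cd "$(mktemp -d)"       # simulate the container's fresh working directory
ls examples/pi.R 2>/dev/null \
  || echo "cannot open file 'pi.R': No such file or directory"
```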

Oozie uses YARN to run your shell, so YARN allocates a container on a random machine. That is something you must internalize until it becomes a reflex: every resource required by an Oozie Action (scripts, libraries, configuration files, etc.) must be

  1. available in HDFS beforehand
  2. downloaded at execution time, thanks to the <file> instructions in your Oozie script
  3. accessed as local files from the current working directory

In your case:

    <exec>script.sh</exec>
    <file>/user/karun/oozie-oozi/script.sh</file>
    <file>/user/karun/some/place/pi.R</file>

and then

    sparkR-submit  ...  pi.R  ...
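Putting the answer's advice together, the question's script.sh would then reference pi.R by its bare name. This is a sketch based on the original script, not a tested configuration; it assumes the extra <file> entry for pi.R from the answer is in place, which is what makes the local copy appear in the container:

```shell
export SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/spark
export YARN_CONF_DIR=/etc/hadoop/conf
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
export HADOOP_CMD=/usr/bin/hadoop
# pi.R was downloaded into the container's working directory by <file>,
# so a bare relative name now resolves:
/SparkR-pkg/lib/SparkR/sparkR-submit --master yarn-client pi.R yarn-client 4
```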