Running sqoop-job shell scripts in parallel from Oozie

Asked: 2017-04-26 01:32:53

Tags: shell hdfs sqoop oozie oozie-coordinator

I have a shell script that executes a sqoop job. The script is below.

#!/bin/bash

table=$1

sqoop job --exec "${table}"

When I pass the table name in the workflow, the sqoop job executes successfully.

The workflow is below.

<workflow-app name="Shell_script" xmlns="uri:oozie:workflow:0.5">
    <start to="shell_script"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="shell_script">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <exec>sqoopjob.sh</exec>
            <argument>test123</argument>
            <file>/user/oozie/sqoop/lib/sqoopjob.sh#sqoopjob.sh</file>
        </shell>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>

With test123 the job executes successfully.

Now I have 300 sqoop jobs like the one above, and I want to execute 10 of them in parallel. All the table names are in a file.

I want to loop over that file and execute 10 sqoop jobs for the first 10 tables, and so on.

How can I do this? Should I prepare 10 workflows? I am really confused.

1 Answer:

Answer 0 (score: 1)

As @Samson Scharfrichter mentioned, you can launch parallel jobs inside the shell script. Create a function runJob() in the script and run it in parallel. Use this template:

#!/bin/bash

runJob() {
    tableName="$1"
    # add other parameters here

    # call sqoop here or do something else
    # write command logs
    # etc., etc.
    # return 0 on success, return 1 on failure

    return 0
}

# Run parallel processes and wait for their completion

# Add a loop here or add more calls
runJob "$table_name" &
runJob "$table_name2" &
runJob "$table_name3" &
# Note: the ampersand in the commands above launches each call as a background process

# Now wait for all processes to complete
FAILED=0

for job in $(jobs -p)
do
    echo "job=$job"
    wait "$job" || let "FAILED+=1"
done

if [ "$FAILED" != "0" ]; then
    echo "Execution FAILED! ($FAILED)"
    # Do something here: log, send a message, etc.

    exit 1
fi

# All processes completed successfully!
# Do something here
echo "Done successfully"
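To cover the asker's actual case (300 tables listed in a file, 10 at a time), the template above can be driven by a loop that launches a batch of ten background jobs and waits for the whole batch before starting the next. Below is a minimal sketch. The file name `tables.txt` is an assumption, and the real `sqoop job --exec` call is replaced by an `echo` placeholder so the script runs anywhere; in the workflow, the table file would be shipped alongside the script via a `<file>` element.

```shell
#!/bin/bash

runJob() {
    tableName="$1"
    # placeholder; in the real script call: sqoop job --exec "${tableName}"
    echo "running job for ${tableName}"
    return 0
}

MAX_PARALLEL=10
FAILED=0

# demo input: 25 hypothetical table names, one per line
printf 'table%d\n' $(seq 1 25) > tables.txt

count=0
while read -r table; do
    runJob "$table" &
    count=$((count + 1))
    # once a full batch of MAX_PARALLEL jobs is launched,
    # wait for all of them before starting the next batch
    if [ "$count" -ge "$MAX_PARALLEL" ]; then
        for job in $(jobs -p); do
            wait "$job" || FAILED=$((FAILED + 1))
        done
        count=0
    fi
done < tables.txt

# wait for the final, possibly partial, batch
for job in $(jobs -p); do
    wait "$job" || FAILED=$((FAILED + 1))
done

echo "failed jobs: ${FAILED}"
```

This keeps everything in one workflow and one shell action; the degree of parallelism is just the `MAX_PARALLEL` constant. Note that each batch waits for its slowest job before the next batch starts, which is simple but slightly less efficient than refilling slots as jobs finish.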