Oozie > Spark action > requirement failed

Asked: 2016-07-01 11:15:30

Tags: apache-spark oozie cloudera-cdh

I am trying to launch an Oozie workflow that contains a Spark action.

spark version: 1.6.0-cdh5.7.0
oozie version: 4.1.0-cdh5.7.0

Workflow file:

<action name="mapping-profiles">
    <spark xmlns="uri:oozie:spark-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <master>yarn-cluster</master>
        <mode>cluster</mode>
        <name>SparkJobName</name>
        <class>ru.some.package.SparkJobName</class>
        <jar>${nameNode}/user/workflows/loader/lib/my-fat-jar.jar</jar>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
</action>

job.properties:

jobTracker={{ hadoop_job_tracker }}
nameNode={{ hadoop_name_node_address }}
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib/lib_20160517130634/spark
oozie.coord.application.path=/user/workflows/loader/coordinator.xml
user.name=hdfs
This is the log I managed to get from Oozie:

2016-07-01 13:43:10,659 WARN org.apache.oozie.action.hadoop.SparkActionExecutor: SERVER[host] USER[hdfs] GROUP[-] TOKEN[] APP[loader-workflow] JOB[0000019-160701125700437-oozie-oozi-W] ACTION[0000019-160701125700437-oozie-oozi-W@mapping-profiles] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, requirement failed
2016-07-01 13:43:10,659 WARN org.apache.oozie.action.hadoop.SparkActionExecutor: SERVER[host] USER[hdfs] GROUP[-] TOKEN[] APP[loader-workflow] JOB[0000019-160701125700437-oozie-oozi-W] ACTION[0000019-160701125700437-oozie-oozi-W@mapping-profiles] Launcher exception: requirement failed
java.lang.IllegalArgumentException: requirement failed
    at scala.Predef$.require(Predef.scala:221)
    at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$6$$anonfun$apply$3.apply(Client.scala:473)
    at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$6$$anonfun$apply$3.apply(Client.scala:471)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
    at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$6.apply(Client.scala:471)
    at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$6.apply(Client.scala:469)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:469)
    at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:725)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:143)
    at org.apache.spark.deploy.yarn.Client.run(Client.scala:1023)
    at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1083)
    at org.apache.spark.deploy.yarn.Client.main(Client.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:185)
    at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:176)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:49)
    at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:46)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:388)
    at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:302)
    at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
    at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

2016-07-01 13:43:10,824 DEBUG org.apache.oozie.command.wf.ActionEndXCommand: SERVER[host] USER[hdfs] GROUP[-] TOKEN[] APP[loader-workflow] JOB[0000019-160701125700437-oozie-oozi-W] ACTION[0000019-160701125700437-oozie-oozi-W@mapping-profiles] Execute command [action.end] key [0000019-160701125700437-oozie-oozi-W]
2016-07-01 13:43:10,825 DEBUG org.apache.oozie.command.wf.ActionEndXCommand: SERVER[host] USER[hdfs] GROUP[-] TOKEN[] APP[loader-workflow] JOB[0000019-160701125700437-oozie-oozi-W] ACTION[0000019-160701125700437-oozie-oozi-W@mapping-profiles] STARTED ActionEndXCommand for action 0000019-160701125700437-oozie-oozi-W@mapping-profiles
2016-07-01 13:43:10,829 DEBUG org.apache.oozie.command.wf.ActionEndXCommand: SERVER[host] USER[hdfs] GROUP[-] TOKEN[] APP[loader-workflow] JOB[0000019-160701125700437-oozie-oozi-W] ACTION[0000019-160701125700437-oozie-oozi-W@mapping-profiles] End, name [mapping-profiles] type [spark] status[DONE] external status [FAILED/KILLED] signal value [null]
2016-07-01 13:43:10,878 INFO org.apache.oozie.command.wf.ActionEndXCommand: SERVER[host] USER[hdfs] GROUP[-] TOKEN[] APP[loader-workflow] JOB[0000019-160701125700437-oozie-oozi-W] ACTION[0000019-160701125700437-oozie-oozi-W@mapping-profiles] ERROR is considered as FAILED for SLA
2016-07-01 13:43:11,052 DEBUG org.apache.oozie.command.wf.SignalXCommand: SERVER[host] USER[hdfs] GROUP[-] TOKEN[] APP[loader-workflow] JOB[0000019-160701125700437-oozie-oozi-W] ACTION[0000019-160701125700437-oozie-oozi-W@mapping-profiles] Execute command [signal] key [0000019-160701125700437-oozie-oozi-W]
2016-07-01 13:43:11,052 DEBUG org.apache.oozie.command.wf.SignalXCommand: SERVER[host] USER[hdfs] GROUP[-] TOKEN[] APP[loader-workflow] JOB[0000019-160701125700437-oozie-oozi-W] ACTION[0000019-160701125700437-oozie-oozi-W@mapping-profiles] STARTED SignalCommand for jobid=0000019-160701125700437-oozie-oozi-W, actionId=0000019-160701125700437-oozie-oozi-W@mapping-profiles

The relevant source code is in org.apache.spark.deploy.yarn.Client.
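For context on why the message is so bare: the `requirement failed` text comes from Scala's `Predef.require(cond)`, which throws `IllegalArgumentException` with exactly that message when called without a detail string, as at `Client.scala:473` in the trace above. A minimal Java sketch of that behavior (the class and method names here are illustrative, not Spark code):

```java
// Illustrates what Scala's Predef.require(cond) does when cond is false and no
// message is supplied: it throws IllegalArgumentException("requirement failed"),
// which is the exact message seen in the Oozie launcher log.
public class RequireDemo {
    public static void require(boolean requirement) {
        if (!requirement) {
            throw new IllegalArgumentException("requirement failed");
        }
    }

    public static void main(String[] args) {
        try {
            require(false);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // prints "requirement failed"
        }
    }
}
```

So the exception itself only says that some precondition in `prepareLocalResources` was violated; the failing condition has to be read from `Client.scala` at the line numbers in the stack trace.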

Can anyone help?

0 answers

No answers yet.