I created an Oozie workflow to perform a Sqoop import from MySQL into Hive.
The Sqoop action that creates the sqoop job runs successfully, but when I try to execute the job that imports from MySQL into Hive, it fails. I have attached the log.
sqoop --hive-import (the failing Sqoop action) works in two steps.
First it runs a Sqoop import into an HDFS directory (the targetDir referenced in my XML).
Then it takes the output of that Sqoop import and loads it into Hive.
When I run my Sqoop job through Oozie, I see a _SUCCESS file in targetDir, which indicates that the Sqoop import itself succeeded. Only the later stage (step 2) fails.
I am running the Oozie workflow as the hue user.
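For context, here is a minimal sketch of the kind of Sqoop command this two-step --hive-import corresponds to; the connection URL, credentials, table name, and directories below are hypothetical placeholders, not my real values:

# step 1 writes to --target-dir on HDFS; step 2 (--hive-import) loads that output into Hive
sqoop import \
    --connect jdbc:mysql://mysql-host:3306/mydb \
    --username myuser \
    --password-file /user/hue/mysql.pwd \
    --table customers \
    --target-dir /user/hue/sqoop/customers \
    --hive-import \
    --hive-table default.customers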
9020 [uber-SubtaskRunner] INFO org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive
9982 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - WARNING: Use "yarn jar" to launch YARN applications.
10278 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - SLF4J: Class path contains multiple SLF4J bindings.
10278 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
10278 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
10278 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
10281 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
12413 [Thread-112] INFO org.apache.sqoop.hive.HiveImport -
12413 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-2.1.0-amzn-0.jar!/hive-log4j2.properties Async: true
13750 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/tez/dag/api/SessionNotRunning
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:586)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at java.lang.reflect.Method.invoke(Method.java:498)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - Caused by: java.lang.ClassNotFoundException: org.apache.tez.dag.api.SessionNotRunning
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
13751 [Thread-112] INFO org.apache.sqoop.hive.HiveImport - ... 10 more
14098 [uber-SubtaskRunner] ERROR org.apache.sqoop.tool.CreateHiveTableTool - Encountered IOException running create table job: java.io.IOException: Hive exited with status 1
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:389)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:339)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:240)
at org.apache.sqoop.tool.CreateHiveTableTool.run(CreateHiveTableTool.java:58)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:455)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:344)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Intercepting System.exit(1)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Answer 0 (score: 0)
The error occurs because Sqoop cannot find Hive.
Try setting the Hive environment variables on all datanodes for the user that runs this sqoop command.
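For example, something along these lines in the environment of that user; the paths below are assumptions based on a typical layout like the /usr/lib/hive seen in the log, so adjust them to your installation:

export HIVE_HOME=/usr/lib/hive
export HIVE_CONF_DIR=/etc/hive/conf
export PATH=$PATH:$HIVE_HOME/bin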
Edit:
From the command line, run
oozie admin -shareliblist hive
and check whether tez-api-*.jar is listed.
If it is not, go to the HDFS path /user/oozie/share/lib/lib_DATA/hive/
and check whether tez-api-*.jar exists there. If it does, update the sharelib with
oozie admin -sharelibupdate
and then check the list again.
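A possible sequence for these checks; the Oozie server URL below is a placeholder, and a glob is used for the timestamped sharelib directory:

# list the hive sharelib as Oozie sees it and look for the Tez jars
oozie admin -oozie http://oozie-host:11000/oozie -shareliblist hive | grep tez
# look for the same jars directly on HDFS under the timestamped sharelib directory
hdfs dfs -ls /user/oozie/share/lib/lib_*/hive/ | grep tez
# if the jars exist on HDFS but are not listed, refresh the sharelib and re-check
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate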
Answer 1 (score: 0)
Sqoop cannot find the Tez jars at runtime (specifically, the class "org.apache.tez.dag.api.SessionNotRunning").
Please verify that you have the Tez jars available as needed.
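One commonly suggested way to make the Hive sharelib (which usually carries the Tez jars on Hive-on-Tez clusters) visible to the Sqoop action is to pin the action's sharelibs in job.properties. This is only a sketch, and it assumes the Tez jars live in the hive sharelib on your cluster:

# job.properties for the Oozie workflow
oozie.use.system.libpath=true
# include the hive sharelib (with the Tez jars) alongside the sqoop one for sqoop actions
oozie.action.sharelib.for.sqoop=sqoop,hive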