I am trying to import a table from an RDBMS (MySQL or AS/400) into Hive. The Sqoop job itself succeeds, but an exception is thrown when the data is loaded into Hive.
Here is the trace log:
18/12/17 04:39:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM ROH319P4 AS t WHERE 1=0
18/12/17 04:39:58 WARN hive.TableDefWriter: Column SOUT19 had to be cast to a less precise type in Hive
18/12/17 04:39:58 WARN hive.TableDefWriter: Column DATE19 had to be cast to a less precise type in Hive
18/12/17 04:39:58 WARN hive.TableDefWriter: Column TIME19 had to be cast to a less precise type in Hive
18/12/17 04:39:58 INFO hive.HiveImport: Loading uploaded data into Hive
18/12/17 04:39:58 INFO conf.HiveConf: Found configuration file file:/etc/hive/conf.dist/hive-site.xml
18/12/17 04:39:59 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Exception thrown in Hive
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:360)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:240)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:333)
    ... 9 more
Caused by: java.lang.NoSuchMethodError: org.apache.logging.log4j.ThreadContext.getThreadContextMap()Lorg/apache/logging/log4j/spi/ThreadContextMap;
    at org.apache.logging.log4j.ThreadContextAccess.getThreadContextMap(ThreadContextAccess.java:45)
    at org.apache.logging.log4j.core.impl.ContextDataInjectorFactory.createDefaultInjector(ContextDataInjectorFactory.java:83)
    at org.apache.logging.log4j.core.impl.ContextDataInjectorFactory.createInjector(ContextDataInjectorFactory.java:67)
    at org.apache.logging.log4j.core.lookup.ContextMapLookup.<init>(ContextMapLookup.java:34)
    at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:116)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:120)
    at org.apache.logging.log4j.core.config.NullConfiguration.<init>(NullConfiguration.java:32)
    at org.apache.logging.log4j.core.LoggerContext.<clinit>(LoggerContext.java:72)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.createContext(ClassLoaderContextSelector.java:171)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.locateContext(ClassLoaderContextSelector.java:145)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:74)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:227)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:158)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:131)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:101)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:188)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:154)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:90)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:82)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:65)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:702)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
    ... 14 more
My Sqoop script:
sqoop import \
--libjars /usr/lib/sqoop/jt400-6.4.jar \
--driver com.ibm.as400.access.AS400JDBCDriver \
--connect "jdbc:as400://172.17.129.30/DB" \
--username xxxxxxx \
--password xxxx@9xxxxx \
--table ROH319P4 \
--columns "ITMC19, ITMD19, PIXT19, SOUT19, USER19, DATE19, TIME19, TERM19" \
--target-dir /user/hive/test100 \
--fields-terminated-by "," \
--hive-import -m 1
I have even checked the logs and restarted Hive, but could not resolve it. The strange thing is that this same command used to work fine, and only now throws this exception.
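A `NoSuchMethodError` on `ThreadContext.getThreadContextMap()` typically means two different Log4j 2 versions are visible on the classpath of the Hive CLI that Sqoop launches (an older `log4j-api` shadowing the one `log4j-core` expects), rather than a problem with the Sqoop command itself. As a sketch of how to check this, the snippet below lists every Log4j jar under the usual package-install library directories; the paths are an assumption for a typical Hadoop distribution, so adjust them for your install:

```shell
#!/bin/sh
# List Log4j jars visible to Sqoop, Hive, and Hadoop. The directories are
# assumptions for a typical package install; change them to match your layout.
for d in /usr/lib/sqoop/lib /usr/lib/hive/lib /usr/lib/hadoop/lib; do
  if [ -d "$d" ]; then
    # Print each matching jar so mismatched log4j-api/log4j-core versions
    # stand out when the versions in the filenames differ.
    find "$d" -name 'log4j*.jar'
  fi
done
```

If the output shows more than one `log4j-api` version (for example one pulled in alongside the `--libjars` dependency and another shipped with Hive), aligning or removing the older jar is the usual fix; a recent package upgrade introducing a second version would also explain why the same command worked before.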