A Sqoop export to Oracle fails with ORA-01438: value larger than specified precision allowed for this column. We tried splitting the data and exporting it in chunks, and we can see which rows fail. But when we export only those failing rows by themselves, they export successfully. Then the full export fails again, repeatedly, every time we run the Sqoop export to Oracle. Total rows: 1,689,105. After adding a row_id, we found that the first 172,200 rows export successfully, and exporting rows 172,200 and 172,201 also succeeds. So we split the Hive table at row 172,200. Exporting the first 172,200 rows then fails at row 172,177; exporting the rows above 172,200 fails at row 452,729. When we inspect the reported rows individually, we cannot find anything wrong with the data.
The error log is below:
2016-12-07 06:53:14,691 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,691 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Exception raised during data export
2016-12-07 06:53:14,691 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,691 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Exception:
java.io.IOException: java.sql.SQLDataException: ORA-01438: value larger than specified precision allowed for this column
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:233)
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:84)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1707)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.sql.SQLDataException: ORA-01438: value larger than specified precision allowed for this column
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:450)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:399)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1017)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:655)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:249)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:566)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:215)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:58)
at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:943)
at oracle.jdbc.driver.OraclePreparedStatement.executeForRowsWithTimeout(OraclePreparedStatement.java:10932)
at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:11043)
at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:244)
at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:231)
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: On input: 172177JACOB IZBICKI & CO. LTD101726825708\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N10.0\N\NNo PremiumNo Premium\N0.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.02343750.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.0\N\N0.00.00.00.00.00.00.01905.00.0No Data2005-05-25 00:00:00.02007-05-26 00:00:00.0\N2.02.0Low0.0Low1.32Low0.0LowNONEIncrease price to drive to profitabilityNULLNULLNULLNULLNULLNULLNULLNULLNULLNULLNULLNULL\N\N\N\N\N\N\N\N\N\N\N\N\N\N0.00.00.00.00.00.00.00.00.00.00.00.00.00.0
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: On input file: hdfs://dnvdevbigdata2.corp.nai.org:8020/user/hive/warehouse/mcafee_masterdatabase.db/wc_output_clv_detail_mdmparent1/000000_0
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: At position 82909184
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Currently processing split:
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Paths:/user/hive/warehouse/XXXXXX.db/XXXXXXX/000000_0:0+67108864,/user/hive/warehouse/mcafee_masterdatabase.db/wc_output_clv_detail_mdmparent1/000000_0:67108864+15820814,/user/hive/warehouse/XXXXXX.db/XXXXXX/000002_0:0+67108736
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: This issue might not necessarily be caused by current input
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: due to the batching nature of export.
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,692 INFO [Thread-11] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
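Note the lines in the log saying "This issue might not necessarily be caused by current input due to the batching nature of export": Sqoop buffers many rows into a single JDBC `executeBatch()` call, so the row it prints (172177) is only where the failing batch begins, not necessarily the row that actually overflows. That would explain why re-exporting the reported rows by themselves succeeds. One way to make the reported row exact (much slower, and assuming these standard Sqoop export properties are honored by your version — they may be ignored when `--direct` is used, so consider dropping `--direct` for the diagnostic run) is to force one row per statement:

```
sqoop export -Dsqoop.export.records.per.statement=1 \
  -Dsqoop.export.statements.per.transaction=1 \
  --connect jdbc:oracle:thin:@odevbi-XXXXXXX:1521/XXXXX \
  ... (remaining arguments as in the statement below)
```

With batching disabled, the "On input:" line in the error log should point at the exact record that triggers ORA-01438.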
The Sqoop statement used:
sqoop export --connect jdbc:oracle:thin:@odevbi-XXXXXXX:1521/XXXXX \
-username YYYY -password YYYY123 \
--table TEST_TABLE \
--input-fields-terminated-by '\001' \
--lines-terminated-by '\n' \
-m 1 \
--input-null-non-string "\\\N" \
--input-null-string "\\\N" \
--direct \
--export-dir /user/hive/warehouse/yyyyy.db/xxxxx
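Since inspecting rows by eye found nothing, another approach is to scan the delimited dump directly and check each numeric field against the target column's NUMBER(p, s) definition: ORA-01438 fires when a value has more digits left of the decimal point than `p - s` allows (excess fractional digits are merely rounded). This is a minimal sketch, assuming a local copy of the HDFS file; the column map (index, name, precision, scale) is a placeholder you would fill in from `DESC TEST_TABLE` in Oracle, and `\x01` matches the `--input-fields-terminated-by '\001'` above.

```python
from decimal import Decimal, InvalidOperation

def fits_number(value: str, precision: int, scale: int) -> bool:
    """Return True if `value` fits Oracle NUMBER(precision, scale).

    ORA-01438 is raised only when the digits LEFT of the decimal
    point exceed precision - scale; Oracle rounds extra fractional
    digits silently, so only the integer part is checked here.
    """
    if value in ("\\N", ""):           # Sqoop null token / empty field
        return True
    try:
        d = Decimal(value)
    except InvalidOperation:
        return True                    # not numeric -> cannot cause ORA-01438
    int_part = abs(int(d))
    int_digits = 0 if int_part == 0 else len(str(int_part))
    return int_digits <= precision - scale

def scan_file(path, columns, sep="\x01"):
    """Yield (line_no, col_name, value) for every field that would
    overflow its declared NUMBER(p, s).

    `columns` maps a 0-based field index to (name, precision, scale);
    fill it from the actual Oracle table definition.
    """
    with open(path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            fields = line.rstrip("\n").split(sep)
            for idx, (name, p, s) in columns.items():
                if idx < len(fields) and not fits_number(fields[idx], p, s):
                    yield line_no, name, fields[idx]

# Hypothetical usage -- indexes and NUMBER(p, s) values are placeholders:
# columns = {33: ("PREMIUM_AMT", 7, 2), 40: ("SCORE", 5, 2)}
# for hit in scan_file("000000_0", columns):
#     print(hit)
```

Running this over the file from the log would, for example, flag the value `2343750.0` in the failing record if its target column were NUMBER(7,2), since 7 integer digits exceed the 5 that `p - s` permits.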