We are using the Hortonworks distribution with Sqoop 1.4.6. I am trying to load an Oracle table from Avro files using a Sqoop export, but it fails with the following stack trace:
Error: java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.RuntimeException: Can't parse input data: 'Objavro.schema��{"type":"record"'
at HADOOP_CLAIMS.__loadFromFields(HADOOP_CLAIMS.java:208)
at HADOOP_CLAIMS.parse(HADOOP_CLAIMS.java:156)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.NumberFormatException
at java.math.BigDecimal.<init>(BigDecimal.java:470)
at java.math.BigDecimal.<init>(BigDecimal.java:739)
at HADOOP_CLAIMS.__loadFromFields(HADOOP_CLAIMS.java:205)
It looks like Sqoop is using TextExportMapper instead of AvroExportMapper, i.e. it treats the Avro data files as plain text (note the Avro file header `Objavro.schema` showing up in the "Can't parse input data" message). I found that this issue was supposedly fixed in version 1.4.5 - https://issues.apache.org/jira/browse/SQOOP-1283
Any idea why this still happens in Sqoop 1.4.6? Is there a different component that needs to be patched as well?
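For reference, the export was launched with a command along these lines (a sketch only: the JDBC URL, username, and HDFS path are placeholders, and `HADOOP_CLAIMS` is the target table from the stack trace):

```shell
# Hypothetical reconstruction of the failing export; connection details are placeholders.
# --export-dir points at a directory of Avro data files produced by an earlier import.
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username myuser -P \
  --table HADOOP_CLAIMS \
  --export-dir /user/etl/hadoop_claims
```

With this invocation Sqoop is expected to detect the file type from the data files themselves and pick the Avro-aware export path, which is what SQOOP-1283 was meant to ensure; instead the job falls back to TextExportMapper as shown above.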