Exception when importing MySQL data into HDFS

Date: 2019-05-09 13:27:07

Tags: mysql hadoop sqoop

I am trying to import MySQL data into HDFS, but I am getting an exception.

I have a table (products) in MySQL and I am using the following command to import the data into HDFS:

bin/sqoop-import --connect jdbc:mysql://localhost:3306/test --username root --password root --table products --target-dir /user/nitin/products

I am getting the following exception:

Error: java.io.IOException: SQLException in nextKeyValue
    at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:553)
    at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.sql.SQLException: Unknown type '246 in column 2 of 3 in binary-encoded result set.
    at com.mysql.jdbc.MysqlIO.extractNativeEncodedColumn(MysqlIO.java:3710)
    at com.mysql.jdbc.MysqlIO.unpackBinaryResultSetRow(MysqlIO.java:3620)
    at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1282)
    at com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:335)
    at com.mysql.jdbc.RowDataDynamic.<init>(RowDataDynamic.java:68)
    at com.mysql.jdbc.MysqlIO.getResultSet(MysqlIO.java:416)
    at com.mysql.jdbc.MysqlIO.readResultsForQueryOrUpdate(MysqlIO.java:1899)
    at com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:1347)
    at com.mysql.jdbc.ServerPreparedStatement.serverExecute(ServerPreparedStatement.java:1393)
    at com.mysql.jdbc.ServerPreparedStatement.executeInternal(ServerPreparedStatement.java:958)
    at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1705)
    at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
    at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
    ... 12 more

I also tried the following command to import the data into HDFS:

bin/sqoop-import --connect jdbc:mysql://localhost:3306/test?zeroDateTimeBehavior=convertToNull --username root --password root --table products --target-dir /user/nitin/product

The MapReduce job still fails.

1 Answer:

Answer 0 (score: 1)

This is caused by a data type conversion problem: MySQL wire type 246 is the DECIMAL/NEWDECIMAL type, which the JDBC driver failed to decode from the binary-encoded result set. Try explicitly defining the column data type mapping with the --map-column-java option.
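As a sketch (the column name price is an assumption, since the question does not show the table schema), you could first identify the DECIMAL column and then tell Sqoop how to map it:

```shell
# Inspect the schema of products to find the DECIMAL column
# (MySQL wire type 246 corresponds to DECIMAL/NEWDECIMAL).
mysql -u root -p test -e "DESCRIBE products"

# Assuming the DECIMAL column is named price (hypothetical),
# map it explicitly to java.math.BigDecimal so Sqoop does not
# have to infer the type from the binary result set:
bin/sqoop-import \
  --connect jdbc:mysql://localhost:3306/test \
  --username root --password root \
  --table products \
  --map-column-java price=java.math.BigDecimal \
  --target-dir /user/nitin/products
```

Mapping the column to java.lang.String is another common workaround when BigDecimal precision is not required downstream.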