Sqoop export of CSV to MySQL fails

Date: 2016-02-20 10:19:04

Tags: mysql csv hadoop sqoop

I have a CSV file in HDFS containing rows like this:

"2015-12-01","Augusta","46728.0","1"

I am trying to export this file to a MySQL table:

CREATE TABLE test.events_top10(
   dt VARCHAR(255),
   name VARCHAR(255),
   summary VARCHAR(255),
   row_number VARCHAR(255)
  );

using the command:

sqoop export  --table events_top10 --export-dir /user/hive/warehouse/result --escaped-by \" --connect ...

The command fails with this error:

Error: java.io.IOException: Can't export data, please check failed map task logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.RuntimeException: Can't parse input data: '2015-12-02,Ashburn,43040.0,9'
    at events_top10.__loadFromFields(events_top10.java:335)
    at events_top10.parse(events_top10.java:268)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more
Caused by: java.util.NoSuchElementException
    at java.util.ArrayList$Itr.next(ArrayList.java:834)
    at events_top10.__loadFromFields(events_top10.java:320)
    ... 12 more

If I omit the --escaped-by \" parameter, the MySQL table ends up with rows like this:

"2015-12-01" | "Augusta"       | "46728.0" | "1" 

Could you explain how to export the CSV file to a MySQL table without the double quotes?

1 answer:

Answer 0 (score: 1):

I had to use both --escaped-by '\\' and --enclosed-by '\"', so the correct command is:

sqoop export  --table events_top10 --export-dir /user/hive/warehouse/result  --escaped-by '\\' --enclosed-by '\"'  --connect ...

For more details, see the official documentation.
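A minimal illustration (not from the original answer) of why the enclosing character matters: without knowing that fields are enclosed in double quotes, a parser treats the quotes as part of the data, which is exactly what happened when --enclosed-by was omitted. A sketch using Python's stdlib csv module:

```python
import csv
import io

# One line of the exported file, as it sits in HDFS.
line = '"2015-12-01","Augusta","46728.0","1"\n'

# Parsing with no enclosing character: the quotes are kept as literal
# data, mirroring the rows like "2015-12-01" that landed in MySQL.
naive = next(csv.reader(io.StringIO(line), quoting=csv.QUOTE_NONE))
print(naive)   # ['"2015-12-01"', '"Augusta"', '"46728.0"', '"1"']

# Declaring the enclosing character (what --enclosed-by '\"' tells
# Sqoop's generated parser) strips the quotes from each field.
quoted = next(csv.reader(io.StringIO(line), quotechar='"'))
print(quoted)  # ['2015-12-01', 'Augusta', '46728.0', '1']
```

The same idea applies on the Sqoop side: the generated record class can only strip the quotes if it is told they are field delimiters rather than data.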