Column type problem when transferring bulk data between MySQL and Hive via Sqoop

Asked: 2019-03-25 17:33:51

Tags: mysql hive sqoop

I am trying to send data from MySQL to a Hive database via Sqoop. After launching, the job stops with a column type problem:

19/03/25 14:26:37 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table piwik_archive_blob_2019_03
19/03/25 14:26:37 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `piwik_archive_blob_2019_03` AS t LIMIT 1
19/03/25 14:26:37 WARN hive.TableDefWriter: Column date1 had to be cast to a less precise type in Hive
19/03/25 14:26:37 WARN hive.TableDefWriter: Column date2 had to be cast to a less precise type in Hive
19/03/25 14:26:37 WARN hive.TableDefWriter: Column ts_archived had to be cast to a less precise type in Hive
19/03/25 14:26:37 ERROR tool.ImportTool: Import failed: java.io.IOException: Hive does not support the SQL type for column value
  at org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:191)
  at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:189)
  at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
  at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
  at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
  at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
  at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
  at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
  at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

My import command includes column mappings:

sqoop import \
  -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
  --connect jdbc:mysql://172.18.9.81:3306/piwik \
  --hive-import \
  --hive-table piwik.piwik_archive_blob_2019_03 \
  --map-column-hive date1=Date,date2=Date,ts_archived=Date \
  --password 'root' \
  --table piwik_archive_blob_2019_03 \
  --username root \
  -m 1
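The error names the column `value`, which in Piwik's `piwik_archive_blob_*` tables is typically a MySQL BLOB that Sqoop cannot map to a Hive type on its own. A hedged sketch of a possible fix (the added `value` mappings are an assumption, not a confirmed solution; also note Hive's own type name is `DATE`, which exists only in Hive 0.12+):

```shell
# Sketch only: explicitly map the BLOB column `value` so Sqoop can generate
# a valid Hive CREATE TABLE statement. Mapping it to STRING on the Hive side
# (and String on the Java side) is one common workaround; BINARY is another
# option if the raw bytes must be preserved.
sqoop import \
  -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
  --connect jdbc:mysql://172.18.9.81:3306/piwik \
  --hive-import \
  --hive-table piwik.piwik_archive_blob_2019_03 \
  --map-column-hive date1=DATE,date2=DATE,ts_archived=DATE,value=STRING \
  --map-column-java value=String \
  --password 'root' \
  --table piwik_archive_blob_2019_03 \
  --username root \
  -m 1
```

The three `TableDefWriter` warnings about `date1`, `date2`, and `ts_archived` being "cast to a less precise type" are separate from the failure and usually harmless; the import aborts only on the unmappable `value` column.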

0 Answers:

No answers yet