Sequence file imported into HDFS by Sqoop not being read in Hive

Date: 2016-06-02 05:20:38

Tags: hive hdfs sqoop sequencefile

I imported a table from MySQL into HDFS using the --as-sequencefile option. I then created a Hive table with a STORED AS SEQUENCEFILE clause and a LOCATION clause pointing to the HDFS directory that holds the Sqoop-imported sequence files.

Sqoop import command:

sqoop import --connect jdbc:mysql://sandbox.hortonworks.com:3306/hirw --username root --password hadoop --table stocks -m 2 --as-sequencefile  --target-dir /user/root/output/hirw/sqoopimport/stocks_seq --delete-target-dir
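
Before moving on to Hive, it can help to confirm that the import produced the expected output; a quick check, assuming the same target directory as above:

hdfs dfs -ls /user/root/output/hirw/sqoopimport/stocks_seq

With -m 2 the directory should typically contain two part-m-* files plus a _SUCCESS marker.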

Hive table creation:

CREATE TABLE stocks_sqoop_seq (id int, symbol string, name string, trade_date date, close_price float, volume int, update_time timestamp)  STORED AS SEQUENCEFILE LOCATION '/user/root/output/hirw/sqoopimport/stocks_seq';

When I now try to query the table, it fails with the following exception:

Failed with exception java.io.IOException:java.lang.RuntimeException: java.io.IOException: WritableName can't load class: stocks
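
The class name in the error comes from the files themselves: a Hadoop SequenceFile records its key and value class names in the file header, and Sqoop's --as-sequencefile mode writes each row as an instance of the record class it generates for the table (here, stocks). You can see the embedded class names by dumping the start of one part file; an illustrative check, where the exact part file name is an assumption:

hdfs dfs -cat /user/root/output/hirw/sqoopimport/stocks_seq/part-m-00000 | head -c 300

The readable strings near the top of the output include the key class and the value class (stocks), which is the class Hive reports it cannot load.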

What am I missing?

1 Answer:

Answer 0 (score: 0):

You also have to declare the input and output format classes explicitly. Create the table like this:

CREATE TABLE stocks_sqoop_seq (
  id int, symbol string,
  name string, trade_date date,
  close_price float, volume int,
  update_time timestamp)
STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.SequenceFileInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat'
LOCATION '/user/root/output/hirw/sqoopimport/stocks_seq';
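
After recreating the table, a quick way to confirm what Hive registered and to re-test the read path (assuming a Hive CLI or Beeline session) is:

DESCRIBE FORMATTED stocks_sqoop_seq;
SELECT * FROM stocks_sqoop_seq LIMIT 5;

DESCRIBE FORMATTED lists the InputFormat and OutputFormat classes Hive will use for the table, so you can verify they match the ones declared above.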