HIVE Error - Block size invalid or too large for this implementation: -40

Asked: 2018-06-04 01:36:09

Tags: hadoop hive avro tweets

I am trying to load Twitter data into an Avro-backed table in Hive.

I created a table using:
CREATE TABLE tweets
  ROW FORMAT SERDE
     'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
  STORED AS INPUTFORMAT
     'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
  OUTPUTFORMAT
     'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
  TBLPROPERTIES ('avro.schema.url'='file:///home/siva/TwitterDataAvroSchema.avsc') ;

The TwitterDataAvroSchema.avsc file contains the following schema:

{"type":"record",
 "name":"Doc",
 "doc":"adoc",
 "fields":[{"name":"id","type":"string"},
           {"name":"user_friends_count","type":["int","null"]},
           {"name":"user_location","type":["string","null"]},
           {"name":"user_description","type":["string","null"]},
           {"name":"user_statuses_count","type":["int","null"]},
           {"name":"user_followers_count","type":["int","null"]},
           {"name":"user_name","type":["string","null"]},
           {"name":"user_screen_name","type":["string","null"]},
           {"name":"created_at","type":["string","null"]},
           {"name":"text","type":["string","null"]},
           {"name":"retweet_count","type":["long","null"]},
           {"name":"retweeted","type":["boolean","null"]},
           {"name":"in_reply_to_user_id","type":["long","null"]},
           {"name":"source","type":["string","null"]},
           {"name":"in_reply_to_status_id","type":["long","null"]},
           {"name":"media_url_https","type":["string","null"]},
           {"name":"expanded_url","type":["string","null"]}
          ]
}

The table was created successfully.
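
One quick sanity check (a sketch, not something from the original post): since the CREATE TABLE statement has no explicit column list, DESCRIBE should show the columns Hive derived from the .avsc file, confirming that avro.schema.url was actually resolved:

-- The output should mirror the fields in TwitterDataAvroSchema.avsc
-- (id, user_friends_count, ..., expanded_url).
DESCRIBE tweets;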

I then loaded data into the table with:

LOAD DATA INPATH '/user/flume/tweets/FlumeData.*' OVERWRITE INTO TABLE tweets;

That also ran successfully.
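
Note that LOAD DATA ... INPATH only moves files into the table's directory; it does not parse or validate them as Avro, so it can succeed even on files the SerDe cannot read. To see what actually landed there, one could list the table's location from the hive prompt (the warehouse path below is an assumption for a default-database table; adjust to your configuration):

-- List the files moved by LOAD DATA (path assumes the default
-- Hive warehouse layout and the default database).
dfs -ls /user/hive/warehouse/tweets;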

Then I tried to query the table's data, and I got the following output:

hive> select * from tweets limit 2;
OK
Failed with exception java.io.IOException:org.apache.avro.AvroRuntimeException: java.io.IOException: Block size invalid or too large for this implementation: -40
Time taken: 1.347 seconds
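
For context, this exception usually means the Avro reader hit bytes that are not an Avro container file at all, so the "block size" it decodes is a nonsense value such as -40; a common cause with Flume is that the source wrote events as raw JSON rather than Avro-serialized records. A hedged way to check from the hive prompt, using a hypothetical file name, is to dump one of the loaded files and look at its first bytes:

-- Dump one of the loaded files (the file name here is hypothetical).
-- A genuine Avro container begins with the magic bytes 'Obj' + 0x01
-- and an embedded copy of the schema; plain JSON tweet events would
-- begin with '{' instead.
dfs -cat /user/hive/warehouse/tweets/FlumeData.1512345678901;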

Can anyone suggest how to fix this error?

0 Answers:

No answers