I have an S3 bucket that contains data in day-wise partitions; each partition holds one day's worth of data in Avro format. I need to load this data into a Hive table.
How can I create a partitioned Avro Hive table to store the data for the last 90 days?
I tried the following table creation command:
hive> CREATE EXTERNAL TABLE fxgm_avro
> PARTITIONED BY (dt string)
> ROW FORMAT SERDE
> 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
> STORED AS INPUTFORMAT
> 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
> OUTPUTFORMAT
> 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
> LOCATION 's3://glbl_mkts/fxgm/bkt=kinesis_fxgm_logs/format=avro/dt=2016-01-12/part-r-00000.avro'
> TBLPROPERTIES (
> 'avro.schema.url'='file://fxgm_avr/fxgm.avsc');
OK
Time taken: 3.535 seconds
When I tried to select from the table, no records were returned:
hive> select * from fxgm_avro limit 10;
OK
Time taken: 0.473 seconds
Is this the correct way to do it, or am I missing something?
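In particular, I suspect the partitions might need to be registered explicitly, and that LOCATION should point at the root prefix rather than at a single .avro file. This is roughly what I am considering (the prefix and date below are only illustrative, and I am not sure this is the right approach):

-- Re-point the table at the root prefix instead of one file (assumed layout)
ALTER TABLE fxgm_avro SET LOCATION 's3://glbl_mkts/fxgm/bkt=kinesis_fxgm_logs/format=avro/';

-- Register a single day's data as a partition explicitly
ALTER TABLE fxgm_avro ADD IF NOT EXISTS PARTITION (dt='2016-01-12')
LOCATION 's3://glbl_mkts/fxgm/bkt=kinesis_fxgm_logs/format=avro/dt=2016-01-12/';

-- Or, if the directories follow the dt=YYYY-MM-DD naming convention,
-- let Hive discover all of them at once
MSCK REPAIR TABLE fxgm_avro;

If MSCK REPAIR TABLE can discover the directories, I assume that would also cover registering the remaining days of the 90-day window, but I have not been able to confirm this.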