I cannot run ALTER TABLE MY_EXTERNAL_TABLE RECOVER PARTITIONS; on Hive 1.2.
When I run the alternative, MSCK REPAIR TABLE MY_EXTERNAL_TABLE;, it only lists the partitions that are missing from the Hive Metastore without actually adding them. Looking at the hive-exec source, I can see at org/apache/hadoop/hive/ql/parse/HiveParser.g:1001:1
that there is no token matching RECOVER PARTITIONS in the grammar.
Is there any way to recover all partitions after creating an external table on Hive 1.2?
Stack trace for ALTER TABLE MY_EXTERNAL_TABLE RECOVER PARTITIONS;:
NoViableAltException(26@[])
at org.apache.hadoop.hive.ql.parse.HiveParser.alterTableStatementSuffix(HiveParser.java:7946)
at org.apache.hadoop.hive.ql.parse.HiveParser.alterStatement(HiveParser.java:7409)
at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2693)
at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1658)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1117)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:202)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:431)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:316)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1189)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1237)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1126)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1116)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: ParseException line 1:45 cannot recognize input near 'recover' 'partitions' '<EOF>' in alter table statement
Note: I am using S3 as storage, on HDP 2.4 for Hadoop, with Hive 1.2.
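For context, what MSCK REPAIR TABLE prints is essentially the difference between the partition directories found under the table location and the partitions registered in the metastore. A minimal pure-Python sketch of that diff (a hypothetical helper for illustration only, not Hive's actual code, which lives in classes such as HiveMetaStoreChecker):

```python
def partitions_not_in_metastore(fs_paths, metastore_specs):
    """Return partition specs found on the filesystem but absent from the
    metastore -- the set MSCK REPAIR TABLE reports as 'not in metastore'.
    Hypothetical illustration of the check, not Hive's implementation."""
    found = set()
    for path in fs_paths:
        # A partition spec is the chain of key=value directory segments.
        spec = tuple(seg for seg in path.strip("/").split("/") if "=" in seg)
        if spec:
            found.add(spec)
    return sorted(found - set(metastore_specs))
```

In Hive 1.2, MSCK REPAIR TABLE is supposed to both report and add these missing partitions; only listing them, as described above, indicates something prevented the add step.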
Answer 0 (score: 1)
Hi, after spending some time debugging I found the fix. The reason MSCK was not adding the partitions is that my partition directory names were in camel case: the filesystem is case-sensitive, but Hive treats all partition column names as lowercase. Once my partition paths were rewritten in lowercase, it worked like a charm.
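The fix described above can be sketched as a path rewrite: lowercase the key part of each key=value directory segment while leaving values and non-partition segments untouched. This is a hypothetical helper, not code from the answer:

```python
def lowercase_partition_keys(path):
    """Rewrite partition segments like Year=2020 to year=2020.

    S3/HDFS paths are case-sensitive, but Hive treats partition column
    names as lowercase, so MSCK REPAIR TABLE skips camel-cased
    partition directories. (Hypothetical helper for illustration.)
    """
    out = []
    for segment in path.split("/"):
        if "=" in segment:
            key, _, value = segment.partition("=")
            segment = key.lower() + "=" + value
        out.append(segment)
    return "/".join(out)
```

For example, s3://bucket/tbl/Year=2020/Month=05 becomes s3://bucket/tbl/year=2020/month=05, which MSCK can then register.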
Answer 1 (score: 0)
Run it from Spark using:
spark.sql("ALTER TABLE MY_EXTERNAL_TABLE RECOVER PARTITIONS")
This will work.