I am trying to create an external table in Hive, but I keep getting the following error:
create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask (state=08S01,code=1)
Aborting command set because "force" is false and command failed: "create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";"
The contents of /tmp/hive_test_1375711405.45852.txt are:
abc\tdef
I am connecting via the beeline command-line interface, which talks to a Thrift HiveServer2.
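For reference, a data file like the one in the question can be reproduced with a short shell snippet (the local path here is hypothetical; the point is that the row is two fields separated by a literal tab, matching the table's `fields terminated by "\t"` clause):

```shell
# Hypothetical local path; the sample row is "abc", a real tab character, then "def".
printf 'abc\tdef\n' > /tmp/hive_test_sample.txt
# Sanity check: count the tab-delimited fields on the line (should print 2).
awk -F'\t' '{print NF}' /tmp/hive_test_sample.txt
```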
Answer 0 (score: 3)
The problem was that I was pointing the external table at a file in HDFS instead of at a directory. The cryptic Hive error message really threw me off.
The solution is to create a directory and put the data file in it. To fix the example above, you would create the directory /tmp/foobar, place hive_test_1375711405.45852.txt inside it, and then create the table like this:
create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/foobar";
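The directory fix above can be sketched as a shell snippet. On a real cluster the HDFS client commands (shown as comments, assuming a standard Hadoop installation) would do the work; the local-filesystem commands below only illustrate the resulting layout:

```shell
# On a real cluster the equivalent steps are (requires an HDFS client on PATH):
#   hdfs dfs -mkdir -p /tmp/foobar
#   hdfs dfs -put /tmp/hive_test_1375711405.45852.txt /tmp/foobar/
# Local-filesystem sketch of the same layout: a directory that contains the data file,
# so the table's LOCATION points at a directory rather than a single file.
mkdir -p /tmp/foobar
printf 'abc\tdef\n' > /tmp/foobar/hive_test_1375711405.45852.txt
ls /tmp/foobar
```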
Answer 1 (score: -1)
Our company ran into a similar problem (a combination of Sentry, Hive, and Kerberos). We solved it by removing all privileges whose hdfs_url was not fully qualified. For example, we changed GRANT ALL ON URI '/user/test' TO ROLE test; to GRANT ALL ON URI 'hdfs-ha-name:///user/test' TO ROLE test;.
You can find the privileges for a specific URI in the Hive database (MySQL in our case).
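Rather than querying the backing MySQL store directly, a Sentry-enabled Hive deployment can also list a role's privileges from beeline; a minimal sketch, using the role name from the example above:

```sql
SHOW GRANT ROLE test;
```

This should include any URI privileges granted to the role, so you can check whether they are fully qualified.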