Hive error when creating an external table (state=08S01, code=1)

Asked: 2013-08-05 14:48:31

Tags: hadoop hive thrift

I'm trying to create an external table in Hive, but I keep getting the following error:

create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask (state=08S01,code=1)
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask (state=08S01,code=1)
Aborting command set because "force" is false and command failed: "create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";"

The contents of /tmp/hive_test_1375711405.45852.txt are:

abc\tdef
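For a delimited text table like this, the field separator must be a literal tab character in the file, not the two-character sequence `\t`. A minimal way to produce such a file from a shell (the filename here is illustrative) is with `printf`, which expands `\t` to a real tab:

```shell
# printf expands \t into an actual tab byte, unlike pasting the text "\t"
printf 'abc\tdef\n' > hive_test.txt

# Inspect the bytes to confirm the tab is really there (od -c shows it as \t)
od -c hive_test.txt
```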

I'm connecting via the beeline command-line interface, which uses Thrift to talk to HiveServer2.

System:

  • Hadoop 2.0.0-cdh4.3.0
  • Hive 0.10.0-cdh4.3.0
  • Beeline 0.10.0-cdh4.3.0
  • Client OS: Red Hat Enterprise Linux Server release 6.4 (Santiago)

2 answers:

Answer 0 (score: 3)

The problem was that I was pointing the external table at a file in HDFS instead of a directory. The cryptic Hive error message really threw me off.

The solution is to create a directory and put the data file in it. To fix the example above, create the directory /tmp/foobar and place hive_test_1375711405.45852.txt inside it. Then create the table like this:

create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/foobar";
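Assuming the data file is sitting on the local filesystem, the directory layout can be set up with the standard HDFS shell before running the DDL (paths taken from the question; this is a sketch of the steps, not something verified against this particular cluster):

```shell
# Create the directory in HDFS; the table's LOCATION will point here,
# and Hive will read every file inside the directory.
hdfs dfs -mkdir -p /tmp/foobar

# Upload the local data file into that directory.
hdfs dfs -put hive_test_1375711405.45852.txt /tmp/foobar/
```

On CDH4-era clusters the same commands may be spelled `hadoop fs -mkdir` / `hadoop fs -put`; both forms invoke the same FileSystem shell.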

Answer 1 (score: -1)

Our company ran into a similar problem (a combination of Sentry, Hive, and Kerberos). We solved it by dropping all privileges whose HDFS URI was not fully qualified. For example, we changed GRANT ALL ON URI '/user/test' TO ROLE test; to GRANT ALL ON URI 'hdfs-ha-name:///user/test' TO ROLE test;

You can find the privileges for a given URI in the Hive metastore database (MySQL in this case).