Hive query fails on INSERT OVERWRITE

Asked: 2014-05-22 14:09:09

Tags: java jdbc hive

I am running queries on Hive (v0.11) over a JDBC connection. The code is as follows:

// Register the Hive 0.11 (HiveServer1) JDBC driver before connecting
Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
Connection con = DriverManager.getConnection(
                "jdbc:hive://192.168.1.10:10000", "", "");
Statement stmt = con.createStatement();
stmt.execute("some query");

The following queries run successfully:

CREATE TABLE testdb.test(name string,id int);

SELECT * FROM testdb.test;

However, any query containing an INSERT OVERWRITE clause fails. For example:

INSERT OVERWRITE DIRECTORY '/user/jim/dir' SELECT * FROM space.test;

INSERT OVERWRITE TABLE testdb.t2 select name,id from testdb.test;

The following stack trace is produced:

java.sql.SQLException: Query returned non-zero code: 1, cause: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
at org.apache.hadoop.hive.jdbc.HivePreparedStatement.executeImmediate(HivePreparedStatement.java:178)
at org.apache.hadoop.hive.jdbc.HivePreparedStatement.executeQuery(HivePreparedStatement.java:141)
at my.pack.test.HiveTest.main(HiveTest.java:31)
  Caused by: HiveServerException(message:Query returned non-zero code: 1, cause: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask, errorCode:1, SQLState:08S01)
at org.apache.hadoop.hive.service.ThriftHive$execute_result$execute_resultStandardScheme.read(ThriftHive.java:1494)
at org.apache.hadoop.hive.service.ThriftHive$execute_result$execute_resultStandardScheme.read(ThriftHive.java:1480)
at org.apache.hadoop.hive.service.ThriftHive$execute_result.read(ThriftHive.java:1430)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
at org.apache.hadoop.hive.service.ThriftHive$Client.recv_execute(ThriftHive.java:116)
at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:103)
at org.apache.hadoop.hive.jdbc.HivePreparedStatement.executeImmediate(HivePreparedStatement.java:176)
... 2 more

The main problem is that these same queries execute successfully from the Hive console.

Can anyone help if I am missing something here? Or is there a better way to achieve this over JDBC?

N.B. - Each query in the blocks above was executed separately, without the trailing semicolon. I added the semicolons only for readability.

1 Answer:

Answer 0 (score: 2)

Hi, I tried your example case and it works fine. When executing the query from the JDBC client, use:

String sql = "INSERT OVERWRITE DIRECTORY '/user/jim/dir' select * from " + tableName;

stmt.execute(sql);

Notes:

  1. Make sure /user/jim/dir is writable. If it is not, make it writable with:

     hadoop fs -chmod a+rwx /user/jim/dir

  2. Use stmt.execute(sql) rather than stmt.executeQuery(sql).

  3. P.S.: If the problem still persists, let me know and I will share the complete code.
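Note 2 above can be captured in a small helper: statements that return rows (SELECT, SHOW, DESCRIBE) may go through executeQuery(), while everything else, including INSERT OVERWRITE, should go through execute(). The routing method below is a hypothetical illustration of that rule of thumb, not part of the Hive JDBC driver:

```java
import java.util.Locale;

public class HiveStatementRouter {

    // Returns true when the statement is expected to produce a ResultSet
    // and can therefore be run with executeQuery(); DML/DDL such as
    // INSERT OVERWRITE should be run with execute() instead.
    static boolean returnsResultSet(String sql) {
        String head = sql.trim().toLowerCase(Locale.ROOT);
        return head.startsWith("select")
                || head.startsWith("show")
                || head.startsWith("describe");
    }

    public static void main(String[] args) {
        System.out.println(returnsResultSet("SELECT * FROM testdb.test"));
        System.out.println(returnsResultSet(
                "INSERT OVERWRITE TABLE testdb.t2 select name,id from testdb.test"));
    }
}
```

In a real client this check would decide, per statement, whether to call stmt.executeQuery(sql) or stmt.execute(sql) on the open java.sql.Statement.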