HIVE_STATS_JDBC_TIMEOUT for Hive queries in Spark

Date: 2018-03-08 15:56:47

Tags: hadoop apache-spark hive

I just set up a new Hadoop 3.0 cluster with Hive 2.3.2 and Spark 2.3. When I try to run some queries against Hive tables, I get the error below.

I know there were some bugs around this in Hive that appear to have been fixed in 2.1.1, but I'm not sure about the 2.3.2 release. Do you have any ideas how to handle this?

Thanks

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_151)
Type in expressions to have them evaluated.
Type :help for more information.

scala> import spark.sql
import spark.sql

scala> sql("show databases")
java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
  at org.apache.spark.sql.hive.HiveUtils$.formatTimeVarsForHiveClient(HiveUtils.scala:205)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
  at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
  at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
  at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
  at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
  at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
  at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:638)
  ... 49 elided

1 Answer:

Answer 0 (score: 0)

I am running Spark 2.3 with Hive 2.3.2 and ran into a similar problem.

The fix you mention applies to Hive 2.1, as you can see in the Spark JIRA:

https://issues.apache.org/jira/browse/SPARK-13446

You can see from the most recent comments there that people are getting exactly the same error as you.

Also, as answered in this SO question, the latest Hive metastore version currently supported by Spark is 2.1.
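
The NoSuchFieldError itself comes from Hive 2.x having removed the HIVE_STATS_JDBC_TIMEOUT field that Spark's bundled Hive 1.2.1 client still references, so the practical goal is to keep the cluster's Hive 2.3.2 jars away from Spark's Hive client. One workaround worth trying (not part of the original answer, so treat it as a sketch): let Spark build an isolated metastore client against a version it supports, via spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars. A minimal spark-shell-style example, assuming hive-site.xml is already on the classpath:

import org.apache.spark.sql.SparkSession

// Build the session before any other SparkSession exists, so these settings
// take effect when the Hive metastore client is created.
val spark = SparkSession.builder()
  .appName("pin-hive-metastore-client")
  // Use a metastore client version Spark 2.3 supports (0.12.0 through 2.1.1).
  .config("spark.sql.hive.metastore.version", "2.1.1")
  // Download matching client jars from Maven instead of picking up whatever
  // Hive jars happen to be on the classpath.
  .config("spark.sql.hive.metastore.jars", "maven")
  .enableHiveSupport()
  .getOrCreate()

spark.sql("show databases").show()

Whether the 2.1.1 client works against your 2.3.2 metastore service is something you would have to verify in your environment; the alternative is to run a metastore version that Spark officially supports.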