How to stop Spark from setting a new Hive VERSION

Date: 2017-06-10 06:23:51

Tags: apache-spark hive

My Hive metastore version is 2.1.0, but when I start spark-shell it updates the version to 1.2.0:

17/06/11 12:04:03 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/root/spark-2.1.1-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/root/spark/jars/datanucleus-core-3.2.10.jar."
17/06/11 12:04:07 ERROR metastore.ObjectStore: Version information found in metastore differs 2.1.0 from expected schema version 1.2.0. Schema verififcation is disabled hive.metastore.schema.verification so setting version.
17/06/11 12:04:09 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException

This has caused my Hive to stop working. I tried setting spark.sql.hive.metastore.version 2.1.0 in spark-defaults.conf, but then my spark-shell would not start. Please help me with this.
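For reference, the relevant spark-defaults.conf lines would look roughly like this. This is a sketch: the spark.sql.hive.metastore.jars entry is an assumption based on the Spark documentation, which says the version setting must be paired with matching client jars because the builtin client is Hive 1.2.x:

    # pin the metastore client to the Hive 2.1.0 API
    spark.sql.hive.metastore.version  2.1.0
    # fetch matching client jars instead of using the builtin Hive 1.2.x client
    spark.sql.hive.metastore.jars     maven

(With jars set to maven, Spark downloads the matching Hive client jars at startup. Note that the Spark 2.1 docs list supported metastore versions only up to 1.2.1, which would explain why spark-shell failed here; this pairing may require Spark 2.2 or later.)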

1 Answer:

Answer 0 (score: 0)

You should be able to disable version verification by updating hive-site.xml:

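Given the property named in the warning log (hive.metastore.schema.verification), the intended entry is presumably something like:

    <property>
      <!-- skip the schema version check between client and metastore -->
      <name>hive.metastore.schema.verification</name>
      <value>false</value>
    </property>

One caveat: the log already shows verification is disabled, and it is exactly that setting which lets Spark's 1.2.x client rewrite the version row. If the goal is to keep the metastore's VERSION table at 2.1.0, the related property hive.metastore.schema.verification.record.version, set to false, should stop clients from updating the recorded version.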