Background:
In hive-site.xml:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
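<!-- JDBC connection settings for the metastore database (MySQL) -->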
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://www.test.com:3306/metastore</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>password</value>
</property>
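<!-- Thrift URI of the remote metastore service -->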
<property>
<name>hive.metastore.uris</name>
<value>thrift://www.test.com:9083</value>
</property>
<property>
<name>hive.metastore.schema.verification</name>
<value>true</value>
</property>
</configuration>
I put hive-site.xml into hive/conf/ and spark/conf/.
But when I run start-thriftserver.sh, I get this error in the log (in spark_home/logs/spark- -HiveThriftServer2 .out):
......
INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
INFO metastore: Trying to connect to metastore with URI thrift://www.test.com:9083
INFO metastore: Connected to metastore.
......
DEBUG ObjectStore: Overriding javax.jdo.option.ConnectionURL value null from jpox.properties with jdbc:derby:memory:;databaseName=/tmp/spark-37dcab7f-655a-4506-abd7-492a8620a33e/metastore;create=true
......
INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
......
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
From the log we can see that the property "hive.metastore.uris" took effect.
But the other MySQL-related properties do not seem to work; the metastore still uses the default Derby database.
Any help is appreciated! Thanks.
Resolution:
I removed hive.metastore.uris, since I only use Hive locally:
<property>
<name>hive.metastore.uris</name>
<value>thrift://www.test.com:9083</value>
</property>
I also set hive.metastore.schema.verification to false. Thanks to Nirmal for the help.
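For reference, a minimal sketch of the relevant hive-site.xml fragment after these two changes (hive.metastore.uris removed, schema verification disabled), assuming the MySQL connection properties from the original file stay unchanged:

<property>
<name>hive.metastore.schema.verification</name>
<!-- disables the schema version check behind "Version information not found in metastore" -->
<value>false</value>
</property>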
Answer 0 (score: 1):
Found this error in the log:
Caused by: MetaException(message:Version information not found in metastore
Set the value of hive.metastore.schema.verification in hive-site.xml to false, for both the hive and spark configs, restart the services, and try again.