I am trying to set up a Spark environment on CentOS 6.8 running in VMware Fusion on macOS.
I previously installed jdk-10.0.1, hadoop-2.4.1.tar.gz, and MySQL, and they all work fine.
However, when I tried to initialize the metastore schema with
schematool -dbType mysql -initSchema
the following error occurred:
which: no hbase in (/opt/hive/bin:/usr/local/hive/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/java/latest/bin:/usr/local/hadoop/sbin:/usr/local/hadoop/bin:/usr/local/hive/bin:/root/bin:/usr/java/latest/bin:/usr/local/hadoop/sbin:/usr/local/hadoop/bin:/opt/hive/bin)
Metastore connection URL: jdbc:mysql://hbase01:3306/myhive?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: myhive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to load driver
Underlying cause: java.lang.ClassNotFoundException : com.mysql.jdbc.Driver
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
I ignored it and ran $HIVE_HOME/bin/hive, and the following error occurred:
which: no hbase in (/opt/hive/bin:/usr/local/hive/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/java/latest/bin:/usr/local/hadoop/sbin:/usr/local/hadoop/bin:/usr/local/hive/bin:/root/bin:/usr/java/latest/bin:/usr/local/hadoop/sbin:/usr/local/hadoop/bin:/opt/hive/bin)
Exception in thread "main" java.lang.ClassCastException: java.base/jdk.internal.loader.ClassLoaders$AppClassLoader cannot be cast to java.base/java.net.URLClassLoader
at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:387)
at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:363)
at org.apache.hadoop.hive.cli.CliSessionState.<init>(CliSessionState.java:60)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:663)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
I am sure I have already put mysql-connector-java into Hive's lib folder, but the problem remains.
Any help is appreciated. Thank you very much!
Answer 0 (score: 0)
Underlying cause: java.lang.ClassNotFoundException : com.mysql.jdbc.Driver
The error above indicates that the MySQL JDBC driver jar (the one providing com.mysql.jdbc.Driver) is missing from Hive's classpath.
wget http://www.java2s.com/Code/JarDownload/mysql/mysql-connector-java-commercial-5.1.7-bin.jar.zip
unzip mysql-connector-java-commercial-5.1.7-bin.jar.zip
cp mysql-connector-java-commercial-5.1.7-bin.jar $HIVE_HOME/lib/
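As a quick sanity check, you can confirm the connector jar actually landed on Hive's classpath; a minimal sketch, assuming the jar name used above (adjust it to whichever version you copied):
# List any MySQL connector jars under Hive's lib directory
ls -l $HIVE_HOME/lib | grep -i mysql
# Confirm the driver class is packaged inside the jar (jar name is just an example)
unzip -l $HIVE_HOME/lib/mysql-connector-java-commercial-5.1.7-bin.jar | grep com/mysql/jdbc/Driver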
sudo mysql
mysql> create database hive DEFAULT CHARACTER SET utf8;
mysql> grant all PRIVILEGES on *.* TO 'hive'@'localhost' IDENTIFIED BY 'password_for_hive' WITH GRANT OPTION;
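To make sure the grant worked, you can log in as the new user before touching Hive; a quick check, assuming the user and password shown above:
mysql -u hive -p
# enter password_for_hive at the prompt
mysql> show databases;   # the hive database should be listed
mysql> exit;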
gedit $HIVE_HOME/conf/hive-site.xml
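If hive-site.xml does not exist yet, it can first be created from the template that ships with Hive; a sketch, assuming a standard $HIVE_HOME layout:
cp $HIVE_HOME/conf/hive-default.xml.template $HIVE_HOME/conf/hive-site.xml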
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/hive</value>
<description>JDBC connection string used by Hive Metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>JDBC Driver class</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
<description>Metastore database user name</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>password_hive</value>
<description>Metastore database password</description>
</property>
<property>
<name>hive.metastore.uris</name>
<value>thrift://localhost:9084</value>
<description>Thrift server hostname and port</description>
</property>
## Save the file and close it
<property>
<name>hive.exec.scratchdir</name>
<value>/home/path/to/apache-hive-2.3.2-bin/iotmp</value>
<description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/<username> is created, with ${hive.scratch.dir.permission}.</description>
</property>
<property>
<name>hive.exec.local.scratchdir</name>
<value>/home/path/to/apache-hive-2.3.2-bin/iotmp</value>
<description>Local scratch space for Hive jobs</description>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>/home/path/to/apache-hive-2.3.2-bin/iotmp</value>
<description>Temporary local directory for added resources in the remote file system.</description>
</property>
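The scratch and resource settings above point at an iotmp directory that may not exist yet; assuming the same placeholder path used in the config, create it and make it writable before starting Hive:
mkdir -p /home/path/to/apache-hive-2.3.2-bin/iotmp
# Example permissions; any mode writable by the user running Hive will do
chmod -R 755 /home/path/to/apache-hive-2.3.2-bin/iotmp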
Finally, initialize the schema, start the metastore service, and launch the Hive CLI:
export METASTORE_PORT=9084
$HIVE_HOME/bin/schematool -initSchema -dbType mysql
hive --service metastore &
hive
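If the initialization succeeded, you can verify it by asking schematool for the schema information and by checking that the metastore tables now exist in MySQL, using the user and password configured above:
$HIVE_HOME/bin/schematool -dbType mysql -info
# Metastore tables such as DBS and TBLS should now exist
mysql -u hive -p -e "use hive; show tables;"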