Installing Hive on Ubuntu (problem with Derby?)

Date: 2016-07-18 10:20:34

Tags: java linux apache-spark hive ubuntu-16.04

I have installed Hadoop, Spark, R, RStudio Server and SparkR, and I am now trying to install Hive.

Following a tutorial found on the internet, here is what I did:

$ cd /home/francois-ubuntu/media/
$ mkdir install-hive
$ cd install-hive
$ wget http://mirrors.ircam.fr/pub/apache/hive/hive-2.1.0/apache-hive-2.1.0-bin.tar.gz
$ tar -xzvf apache-hive-2.1.0-bin.tar.gz
$ mkdir /usr/lib/hive
$ mv apache-hive-2.1.0-bin /usr/lib/hive
$ cd
$ rm -rf /home/francois-ubuntu/media/install-hive
$ sudo vim ~/.bashrc

In .bashrc, I wrote the following (I am also including the lines relative to Java, Hadoop and Spark, in case it helps):

# Set JAVA_HOME
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

# Set HADOOP_HOME
alias hadoop=/usr/local/hadoop/bin/hadoop
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin

# Set SPARK_HOME
export SPARK_HOME=/usr/local/spark

# Set HIVE_HOME
export HIVE_HOME=/usr/lib/hive/apache-hive-2.1.0-bin
PATH=$PATH:$HIVE_HOME/bin
export PATH
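As a sanity check (a sketch that only reuses the paths from the `.bashrc` above; `hive` itself does not need to be installed for this), the PATH entry can be verified after reloading the profile:

```shell
# Re-apply the relevant .bashrc lines and confirm the Hive bin
# directory actually landed on PATH before trying to run `hive`.
export HIVE_HOME=/usr/lib/hive/apache-hive-2.1.0-bin
export PATH="$PATH:$HIVE_HOME/bin"

case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "hive bin dir is on PATH" ;;
  *)                    echo "hive bin dir is missing from PATH" ;;
esac
```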

Back to the CLI:

$ cd /usr/lib/hive/apache-hive-2.1.0-bin/bin
$ sudo vim hive-config.sh

In hive-config.sh, I added:

export HADOOP_HOME=/usr/local/hadoop

Then :wq, and back to the CLI:

$ hadoop fs -mkdir /usr/hive/warehouse
$ hadoop fs -chmod g+w /usr/hive/warehouse
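One step the tutorial appears to skip (an assumption on my part, based on how Hive 2.x behaves): since Hive 2.x the metastore schema is no longer created automatically, so the embedded Derby database should be initialized once with `schematool` before the first run, from a directory the user can write to. A guarded sketch:

```shell
# The embedded Derby metastore writes derby.log and metastore_db/ into the
# current working directory, so run this from somewhere writable.
cd "$HOME"

# schematool ships in $HIVE_HOME/bin; guard in case it is not on PATH here.
if command -v schematool >/dev/null 2>&1; then
  schematool -dbType derby -initSchema
else
  echo "schematool not found on PATH (expected in \$HIVE_HOME/bin)"
fi
```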

Finally:

$ hive

Here is what I get:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Mon Jul 18 12:13:44 CEST 2016 Thread[main,5,main] java.io.FileNotFoundException: derby.log (Permission denied)
----------------------------------------------------------------
Mon Jul 18 12:13:45 CEST 2016:
Booting Derby (version The Apache Software Foundation - Apache Derby - 10.10.2.0 - (1582446)) instance a816c00e-0155-fd7f-479a-0000040c9aa0 
on database directory /usr/lib/hive/apache-hive-2.1.0-bin/bin/metastore_db in READ ONLY mode with class loader sun.misc.Launcher$AppClassLoader@2e5c649. 
Loaded from file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/derby-10.10.2.0.jar.
java.vendor=Oracle Corporation
java.runtime.version=1.8.0_91-8u91-b14-0ubuntu4~16.04.1-b14
user.dir=/usr/lib/hive/apache-hive-2.1.0-bin/bin
os.name=Linux
os.arch=amd64
os.version=4.4.0-28-generic
derby.system.home=null
Database Class Loader started - derby.database.classpath=''

And then... nothing. It just hangs there. According to the tutorial, I should get the Hive prompt (hive>) at this point, but I don't. I tried a few Hive commands and they do not work. I don't get the classic shell prompt back either: there is no prompt at all. I can type things, but I cannot execute anything. The only thing I seem to be able to do is kill it with Ctrl+C.
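For context, the `derby.log (Permission denied)` and `READ ONLY mode` lines above come from the embedded Derby metastore, which creates its files in the current working directory (here `/usr/lib/hive/apache-hive-2.1.0-bin/bin`, which a regular user cannot write to). That behavior can be reproduced in miniature without Hive at all (`simulate_hive_launch` below is a hypothetical stand-in for the real `hive` binary):

```shell
# Stand-in for hive's embedded-Derby startup: Derby creates derby.log and
# metastore_db/ in whatever directory the process is launched from.
demo=$(mktemp -d)
simulate_hive_launch() {
  ( cd "$1" && touch derby.log && mkdir -p metastore_db )
}

mkdir -p "$demo/runA" "$demo/runB"
simulate_hive_launch "$demo/runA"
simulate_hive_launch "$demo/runB"

# Each launch directory now holds its *own* derby.log and metastore_db/:
ls "$demo/runA"
ls "$demo/runB"
rm -rf "$demo"
```

So launching `hive` from different directories yields different (and possibly unwritable) metastores.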

Any idea what is wrong?

Thanks.

Edit 1:

Following this advice from @Hawknight, and the help given here, I did the following:

sudo addgroup hive
sudo useradd -g hive hive
sudo adduser hive sudo
sudo mkdir /home/hive
sudo chown -R hive:hive /home/hive
sudo chown -R hive:hive /usr/lib/hive/
visudo

And added this line to the sudoers file:

hive ALL=(ALL) NOPASSWD:ALL

Then, back to the CLI:

sudo su hive
hive

But I still get the same problem.

Edit 2:

Following the advice given here, I now get a different error. The error output is long, and I feel copying all of it may not be useful since the later errors probably stem from the first one, so here is the beginning:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Mon Jul 18 18:03:44 CEST 2016 Thread[main,5,main] java.io.FileNotFoundException: derby.log (Permission denied)
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:578)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:545)
    ... 9 more

Let me know if you need the rest of the error log.

1 Answer:

Answer 0 (score: 1)

The actual SLF4J binding being used is Log4j 2. For this to work, you need matching log4j-api and log4j-core dependencies on the classpath. You also need a log4j2.xml configuration on the classpath, since by default only ERROR messages are printed to the console. The Log4j 2 manual has many example configurations.
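For reference, a minimal console-only configuration modeled on the examples in the Log4j 2 manual (a sketch, not Hive's shipped `hive-log4j2.properties`) could look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Single console appender; adjust the pattern to taste. -->
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- INFO rather than the default ERROR, so startup messages are visible. -->
    <Root level="info">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```

Placing this file on the classpath (or pointing at it with `-Dlog4j.configurationFile=`) makes the Log4j 2 binding usable.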

You may also want to remove slf4j-log4j12-1.7.10.jar from the classpath.
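A sketch for locating the two competing bindings first (the directories are the ones from the question's logs; adjust to your layout — on a machine without these installs the loop simply prints nothing):

```shell
# Look for both SLF4J bindings reported in the warning. On the asker's box
# these live under Hadoop's and Hive's lib directories respectively.
for dir in /usr/local/hadoop /usr/lib/hive; do
  find "$dir" \( -name 'slf4j-log4j12-*.jar' -o -name 'log4j-slf4j-impl-*.jar' \) \
       -print 2>/dev/null || true
done
# Keep one binding (here: Hive's log4j-slf4j-impl) and move the other aside,
# e.g.: mv slf4j-log4j12-1.7.10.jar slf4j-log4j12-1.7.10.jar.bak
```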