Connecting to Hive inside a Cloudera Docker container via JDBC

Asked: 2017-04-24 11:59:35

Tags: jdbc hive cloudera-quickstart-vm

I installed a Cloudera Docker container locally and mapped the Hive port as well, like this:

    docker run --hostname=quickstart.cloudera --privileged=true -t -i -p 8888:8888 -p 80:80 -p 10000:10000 --name cloudera2 cloudera/quickstart /usr/bin/docker-quickstart

I want to connect to it with JDBC. My code looks like this:

import java.sql.{Connection, DriverManager}

val driver = "org.apache.hive.jdbc.HiveDriver"
val url = "jdbc:hive2://localhost:10000/default"
val username = ""
val password = ""

// there's probably a better way to do this
var connection: Connection = null

try {
  // load the Hive JDBC driver
  Class.forName(driver)
} catch {
  case e: Throwable => e.printStackTrace()
}
connection = DriverManager.getConnection(url, username, password)
connection.close()

But a NoClassDefFoundError is thrown when I try to run it:

log4j:WARN No appenders could be found for logger (org.apache.hive.jdbc.Utils).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
    at org.apache.hive.jdbc.HiveConnection.createUnderlyingTransport(HiveConnection.java:362)
    at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:382)
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:193)
    at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:167)
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:247)
    at ScalaJdbcConnectSelect$.main(ScalaJdbcConnectSelect.scala:32)

Hive version:

Hive 1.1.0-cdh5.7.0

Maven dependency:

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-jdbc</artifactId>
        <version>1.1.0-cdh5.7.0</version>
    </dependency>

I'm not sure whether it is related to the username and password, but I have already tried "cloudera"/"cloudera", "hive"/"", and ""/"".

1 Answer:

Answer 0 (score: 0)

I found that I needed to add the hadoop-common dependency as well; in my case it was the following:

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.6.0-cdh5.7.0</version>
    </dependency>

After adding it, the connection works fine.
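
For completeness, here is a minimal sketch of how the connection can be verified once hadoop-common is on the classpath. It assumes the same URL and empty credentials as in the question; the object name `HiveSmokeTest` is just an illustrative choice:

    import java.sql.{Connection, DriverManager, ResultSet}

    object HiveSmokeTest {
      def main(args: Array[String]): Unit = {
        // register the Hive JDBC driver
        Class.forName("org.apache.hive.jdbc.HiveDriver")

        // same connection settings as in the question
        val connection: Connection = DriverManager.getConnection(
          "jdbc:hive2://localhost:10000/default", "", "")
        try {
          val stmt = connection.createStatement()
          // simple query just to confirm the connection is usable
          val rs: ResultSet = stmt.executeQuery("SHOW TABLES")
          while (rs.next()) {
            println(rs.getString(1))
          }
        } finally {
          connection.close()
        }
      }
    }

If `SHOW TABLES` prints the table names from the default database, the driver, the Hadoop classes, and the anonymous login are all working.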