Impala JDBC connection issue in Spark cluster mode

Asked: 2018-02-26 07:59:18

Tags: apache-spark jdbc yarn impala

When a Spark job is run in cluster mode, the Impala JDBC connection throws an exception. The Spark job creates a Hive table and then issues Impala invalidate/refresh statements for that table over JDBC. The same job runs successfully in Spark client mode.

java.sql.SQLException: [Simba][ImpalaJDBCDriver](500164) Error initialized or created transport for authentication: [Simba][ImpalaJDBCDriver](500169) Unable to connect to server: GSS initiate failed.
    at com.cloudera.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source)
    at com.cloudera.hivecommon.api.HiveServer2ClientFactory.createClient(Unknown Source)
    at com.cloudera.hivecommon.core.HiveJDBCCommonConnection.connect(Unknown Source)
    at com.cloudera.impala.core.ImpalaJDBCConnection.connect(Unknown Source)
    at com.cloudera.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
    at com.cloudera.jdbc.common.AbstractDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:270)

1 Answer:

Answer (score: 1)

In cluster mode the driver runs on a cluster node, so the JDBC call has to execute under the Kerberos login obtained from the keytab. Wrap the connection in UserGroupInformation.doAs and submit the job with --keytab and --principal:

import java.security.PrivilegedAction
import java.sql.{Connection, DriverManager}

import org.apache.hadoop.security.UserGroupInformation

// Opens an Impala JDBC connection as the Kerberos login user so that the
// GSS handshake succeeds even when the driver runs on a cluster node.
protected def getImpalaConnection(impalaJdbcDriver: String, impalaJdbcUrl: String): Connection = {
  if (impalaJdbcDriver.isEmpty) return null
  try {
    // Register the JDBC driver class.
    Class.forName(impalaJdbcDriver).newInstance
    // Open the connection inside the login user's security context.
    UserGroupInformation.getLoginUser.doAs(
      new PrivilegedAction[Connection] {
        override def run(): Connection = DriverManager.getConnection(impalaJdbcUrl)
      }
    )
  } catch {
    case e: Exception =>
      println(e.toString + " --> " + e.getStackTrace.mkString("\n"))
      throw e
  }
}

// Cloudera Impala JDBC 4.1 driver class and a Kerberos-secured connection URL
// (in the Cloudera/Simba driver, AuthMech=1 selects Kerberos authentication).
val impalaJdbcDriver = "com.cloudera.impala.jdbc41.Driver"

val impalaJdbcUrl = "jdbc:impala://<Impala_Host>:21050/default;AuthMech=1;SSL=1;KrbRealm=HOST.COM;KrbHostFQDN=_HOST;KrbServiceName=impala;REQUEST_POOL=xyz"

println("Start impala connection")

val impalaConnection = getImpalaConnection(impalaJdbcDriver, impalaJdbcUrl)

val result = impalaConnection.createStatement.executeQuery("SELECT COUNT(1) FROM testTable")
println("End impala connection")

Build a fat JAR and submit it with the spark-submit command given below. If needed, you can pass additional arguments such as --files and --jars.

Spark submit command:

spark-submit --master yarn-cluster --keytab /home/testuser/testuser.keytab --principal testuser@host.COM --queue xyz --class com.dim.UpdateImpala <path-to-fat-jar>
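For instance, if the Impala JDBC driver jar and any extra config files need to be shipped with the job, the command could be extended along these lines; the /path/to/... entries are placeholders, not paths from the original post:

spark-submit --master yarn-cluster \
  --keytab /home/testuser/testuser.keytab \
  --principal testuser@host.COM \
  --queue xyz \
  --jars /path/to/ImpalaJDBC41.jar \
  --files /path/to/extra.conf \
  --class com.dim.UpdateImpala \
  <path-to-fat-jar>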

Depending on your Spark version, make the following change:

For Spark 1: UserGroupInformation.getLoginUser

For Spark 2: UserGroupInformation.getCurrentUser
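As a hedged illustration of that change, the only line that differs inside getImpalaConnection is the UserGroupInformation lookup used for doAs. A sketch that switches on a Spark version string (how you obtain sparkVersion, e.g. from sc.version, is up to your setup and is assumed here):

// Illustrative only: pick the UGI for doAs based on the Spark major version.
val ugi =
  if (sparkVersion.startsWith("1.")) UserGroupInformation.getLoginUser   // Spark 1.x
  else UserGroupInformation.getCurrentUser                               // Spark 2.x

val conn = ugi.doAs(new PrivilegedAction[Connection] {
  override def run(): Connection = DriverManager.getConnection(impalaJdbcUrl)
})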