Spark 1.4.0 cannot connect to HBase 1.1.0.1

Date: 2015-07-01 11:20:09

Tags: apache-spark hbase

It raises this error:

    Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.util.Addressing.getIpAddress()Ljava/net/InetAddress;

I can, however, connect to HBase successfully from the spark-shell. Does anyone know where the problem is?

Detailed error:

15/07/01 18:57:57 ERROR yarn.ApplicationMaster: User class threw exception: java.io.IOException: java.lang.reflect.InvocationTargetException
java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
    at com.koudai.resys.tmp.HbaseLearning$.main(HbaseLearning.scala:22)
    at com.koudai.resys.tmp.HbaseLearning.main(HbaseLearning.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:483)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
    ... 9 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.util.Addressing.getIpAddress()Ljava/net/InetAddress;
    at org.apache.hadoop.hbase.client.ClientIdGenerator.getIpAddressBytes(ClientIdGenerator.java:83)
    at org.apache.hadoop.hbase.client.ClientIdGenerator.generateClientId(ClientIdGenerator.java:43)
    at org.apache.hadoop.hbase.client.PerClientRandomNonceGenerator.<init>(PerClientRandomNonceGenerator.java:37)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:682)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
    ... 14 more

sbt config:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.4.0" % "provided"
libraryDependencies += "org.apache.hbase" % "hbase" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.1.0.1"
libraryDependencies += "org.apache.hbase" % "hbase-hadoop2-compat" % "1.1.0.1"
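If the 0.94-era classes arrive transitively through one of the dependencies above, an sbt exclusion rule can keep the stale jar off the classpath. This is only a sketch: `some.org` and `offending-module` are placeholders for whatever module actually drags in the old HBase classes (the sbt-dependency-graph plugin's `dependencyTree` task can reveal it).

```scala
// build.sbt sketch -- the organization/name below are placeholders, not the
// real culprit; substitute the module that dependencyTree shows pulling in
// the HBase 0.94 classes.
libraryDependencies += ("org.apache.hbase" % "hbase-server" % "1.1.0.1")
  .exclude("some.org", "offending-module")
```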

Listing the methods of org.apache.hadoop.hbase.util.Addressing via reflection shows that it is the HBase 0.94 version. Where could that be coming from?

The running code:

    import org.apache.spark.SparkConf
    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.ConnectionFactory

    val sc = new SparkConf().setAppName("hbase@user")
    val conf = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.set("hbase.zookeeper.quorum", "idc02-rs-sfa-10")

    // the error is raised here
    val conn = ConnectionFactory.createConnection(conf)

1 Answer:

Answer 0 (score: 0)

The problem is a classpath conflict: somewhere, an hbase-0.94 jar conflicts with hbase-1.1.0.1.

For others hitting this, here are two ways to identify this kind of problem:

Method 1: use reflection to inspect the class that was actually loaded

    val methods = classOf[org.apache.hadoop.hbase.util.Addressing].getMethods
    methods.foreach(method => println(method.getName))
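Beyond listing method names, you can ask the JVM which jar a class was actually loaded from; the code-source URL points straight at the stale artifact. A sketch (the `jarOf` helper is mine, not an HBase or Spark API); in the Spark job you would pass `classOf[org.apache.hadoop.hbase.util.Addressing]`:

```scala
// Print the jar (code source) a class was loaded from. Demonstrated with a
// Scala library class here; bootstrap classes have no code source, hence
// the Option wrapper.
def jarOf(cls: Class[_]): String =
  Option(cls.getProtectionDomain.getCodeSource)
    .map(_.getLocation.toString)
    .getOrElse("(bootstrap / no code source)")

println(jarOf(classOf[scala.Option[_]]))
```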

Method 2: print the classpath on the cluster node for debugging

    // Walk the classloader hierarchy and collect every URL on the classpath
    def urlses(cl: ClassLoader): Array[java.net.URL] = cl match {
      case null => Array()
      case u: java.net.URLClassLoader => u.getURLs() ++ urlses(cl.getParent)
      case _ => urlses(cl.getParent)
    }

    val urls = urlses(getClass.getClassLoader)
    urls.foreach(url => println(url.toString))
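Printing the whole classpath can be noisy. A complementary trick is to ask the class loader which jar serves one specific `.class` resource; the returned `jar:` URL names the exact archive. The resource path below uses a Scala class for illustration; in the job you would look up `"org/apache/hadoop/hbase/util/Addressing.class"` instead.

```scala
// The URL identifies the archive that actually provides the class, which
// immediately exposes a stale jar shadowing the expected one.
val res = getClass.getClassLoader.getResource("scala/Option.class")
println(res)
```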