I currently have a fresh CDH 5.1 for VirtualBox image, and I run into a problem when I try to connect to HBase from the Spark shell. Here is the Scala code:
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{HBaseAdmin, HTable, Put, Get}
import org.apache.hadoop.hbase.util.Bytes

// Build an HBase configuration and open an admin connection to the cluster
val conf = new HBaseConfiguration()
val admin = new HBaseAdmin(conf)
Here is the error:
java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:416)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:393)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:274)
at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:192)
...
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
... 43 more
Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:195)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
... 48 more
Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 54 more
Answer 0 (score: 1)
The problem was that I had not set up my Spark configuration correctly. I had to add the following to the Spark config:
In spark-defaults.conf:
spark.executor.extraClassPath /usr/lib/hive/lib/hive-hbase-handler.jar:/usr/lib/hbase/hbase-server.jar:/usr/lib/hbase/hbase-protocol.jar:/usr/lib/hbase/hbase-hadoop2-compat.jar:/usr/lib/hbase/hbase-client.jar:/usr/lib/hbase/hbase-common.jar:/usr/lib/hbase/lib/htrace-core.jar:/etc/hbase/conf
In spark-env.sh:
export SPARK_CLASSPATH=/usr/lib/hbase/hbase-server.jar:/usr/lib/hbase/hbase-protocol.jar:/usr/lib/hbase/hbase-hadoop2-compat.jar:/usr/lib/hbase/hbase-client.jar:/usr/lib/hbase/hbase-common.jar:/usr/lib/hbase/lib/htrace-core.jar:/etc/hbase/conf
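With both settings in place, restart spark-shell and the original snippet should get past the ZooKeeper lookup. A minimal verification sketch (it assumes the CDH jar paths above; listing tables is just an illustrative smoke test, not part of the original answer):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.HBaseAdmin

// hbase-site.xml is now picked up via /etc/hbase/conf on the classpath
val conf = HBaseConfiguration.create()

// Constructing HBaseAdmin performs the ZooKeeper calls that previously
// threw NoClassDefFoundError: org/cloudera/htrace/Trace
val admin = new HBaseAdmin(conf)

// Smoke test: print the names of all tables the cluster knows about
admin.listTables().foreach(t => println(t.getNameAsString))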
Answer 1 (score: 0)
It could be a version mismatch; errors of the NoSuchXXXError form are often caused by one. Is your HBase also from CDH 5.1?
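If a mismatch is suspected, one quick check (a sketch, not from the original answer) is to print the version of the HBase client jars that the Spark shell actually loaded and compare it with what `hbase version` reports on the server:

import org.apache.hadoop.hbase.util.VersionInfo

// Version of the HBase client classes on the spark-shell classpath;
// compare this against the version the HBase server reports
println(VersionInfo.getVersion())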