Cannot connect to HBase using Java

Posted: 2015-02-03 06:40:39

Tags: java hbase apache-spark

I am trying to connect to HBase using Java. There is only one node, which is my own machine, but it seems I cannot connect successfully.

Here is my Java code:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.MasterNotRunningException;
import org.apache.hadoop.hbase.ZooKeeperConnectionException;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import com.google.protobuf.ServiceException;
public class Test {
  public static void main(String[] args) throws MasterNotRunningException, ZooKeeperConnectionException, IOException, ServiceException {
    SparkConf conf = new SparkConf().setAppName("Test").setMaster("spark://10.239.58.111:7077");
    JavaSparkContext sc = new JavaSparkContext(conf);
    sc.addJar("/home/cloudera/workspace/Test/target/Test-0.0.1-SNAPSHOT.jar");
    Configuration hbaseConf = HBaseConfiguration.create();
    hbaseConf.addResource(new Path("/usr/lib/hbase/conf/hbase-site.xml"));
    HTable table = new HTable(hbaseConf, "rdga_by_id");
  }
}

I have also tried setting the configuration directly in code, like this:

hbaseConf.set("hbase.master", "localhost");
hbaseConf.set("hbase.master.port", "60000");
hbaseConf.set("hbase.zookeeper.property.clientPort", "2181");
hbaseConf.set("hbase.zookeeper.quorum", "quickstart.cloudera");
hbaseConf.set("hbase.zookeeper.quorum", "localhost"); 

But it still does not work.

Here is the hbase-site.xml:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hbase.rest.port</name>
    <value>8070</value>
    <description>The port for the HBase REST server.</description>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://quickstart.cloudera:8020/hbase</value>
  </property>
  <property>
    <name>hbase.regionserver.ipc.address</name>
    <value>0.0.0.0</value>
  </property>
  <property>
    <name>hbase.master.ipc.address</name>
    <value>0.0.0.0</value>
  </property>
  <property>
    <name>hbase.thrift.info.bindAddress</name>
    <value>0.0.0.0</value>
  </property>
</configuration>

On the web UI of the running server, it says the serverName is "quickstart.cloudera,16201,1422941563375".

The error is:

2015-02-02 22:17:03,121 INFO  [main] zookeeper.ZooKeeper (ZooKeeper.java:<init>(438)) - Initiating client connection, connectString=quickstart.cloudera:16201 sessionTimeout=90000 watcher=hconnection-0x62ad0636, quorum=quickstart.cloudera:16201, baseZNode=/hbase
Exception in thread "main" java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:413)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:390)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:271)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:198)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:160)
at Test.main(Test.java:52)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:411)
... 12 more
Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:216)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:839)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:642)
... 17 more
Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 23 more

Sorry for making you read so much code. Thanks in advance.

4 answers:

Answer 0 (score: 2)

Caused by:

java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace

Based on this line in the stack trace, including htrace-core.jar on the classpath may help.
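A minimal diagnostic sketch, if you want to confirm whether that class is visible to the JVM before creating the HTable. The class name comes from the stack trace above; the class and message strings here are otherwise made up:

public class HtraceCheck {
  public static void main(String[] args) {
    try {
      // Try to load the class reported as missing in the stack trace.
      Class.forName("org.cloudera.htrace.Trace");
      System.out.println("org.cloudera.htrace.Trace found - htrace-core.jar is on the classpath");
    } catch (ClassNotFoundException e) {
      // If this is thrown, htrace-core.jar is not on this JVM's classpath.
      System.err.println("htrace-core.jar appears to be missing: " + e);
    }
  }
}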

Answer 1 (score: 0)

For Spark-HBase integration, the best approach is to add the HBase libraries to the Spark classpath. This can be done with the 'compute-classpath.sh' script in the $SPARK_HOME/bin folder. Spark invokes 'compute-classpath.sh' and so picks up the required HBase jars.

export CLASSPATH=$CLASSPATH:<path/to/HBase/lib/*>

For example: export CLASSPATH=$CLASSPATH:/opt/cloudera/parcels/CDH/lib/hbase/lib/*

After that, restart Spark.
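As an alternative sketch (not part of this answer), the executor classpath can also be pointed at the same HBase lib directory from the driver code via Spark configuration; the parcel path is taken from the example above and is only an assumption about your layout:

// Sketch only: point executors at the HBase libs already installed on each node.
// The driver-side classpath still has to be set before the driver JVM starts,
// e.g. with spark-submit --driver-class-path /opt/cloudera/parcels/CDH/lib/hbase/lib/*
SparkConf conf = new SparkConf()
    .setAppName("Test")
    .set("spark.executor.extraClassPath", "/opt/cloudera/parcels/CDH/lib/hbase/lib/*");
JavaSparkContext sc = new JavaSparkContext(conf);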

There you go :)

Answer 2 (score: 0)

Provide the full path to this jar, like this ->

sc.addJar("htrace-core.jar");

Answer 3 (score: 0)

I was also facing this error when connecting to HBase from a Java library. I first added the HBase libs to the classpath, but that alone did not work.

But then I added the hbase-solr path, because the htrace jar is present in that path:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/lib/hbase/*

Hope this works for you.