I am trying to connect from a Linux client to an HBase server running on a different machine, and I get the error below. The same code works fine from my Windows laptop: I can connect to the HBase server and get results. I think I am missing some dependency jars on my Linux server, because it started working from my laptop once I added the hbase-client jar, which suggests my code logic is correct. All the configuration is picked up correctly, as I verified from the laptop. I am passing hbase-site.xml, core-site.xml and hdfs-site.xml in my resources. My port and ZooKeeper quorum are correct, and my Kerberos code works fine. Any suggestions would be appreciated.
Code (the connection comes back null):
this.conf = HBaseConfiguration.create();
this.conf.set("hbase.zookeeper.quorum", zookeeperQuorum);
this.conf.set("hbase.zookeeper.property.clientPort", port);
this.conf.set("zookeeper.znode.parent", "/hbase-secure");
//this.conf.set("hbase.client.retries.number", Integer.toString(35));
//this.conf.set("zookeeper.session.timeout", Integer.toString(20000));
//this.conf.set("zookeeper.recovery.retry", Integer.toString(1));
this.conf.set("hadoop.security.authentication", "kerberos");
this.conf.set("hbase.security.authentication", "kerberos");
this.conf.set("hbase.master.kerberos.principal", userName);
this.conf.set("user.name", userName);
try {
    this.connection = HConnectionManager.createConnection(conf);
} catch (IOException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
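The Kerberos login itself is not shown above (the question says it works fine). For completeness, here is a minimal sketch of what such a login typically looks like with Hadoop's UserGroupInformation; the class name, principal and keytab path below are placeholders of my own, not values from the question:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginSketch {
    // Builds a secure client configuration and performs a keytab login.
    public static Configuration loginAndConfigure(String principal, String keytabPath) throws IOException {
        Configuration conf = HBaseConfiguration.create(); // picks up hbase-site.xml from the classpath
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("hbase.security.authentication", "kerberos");
        // UGI must see the secure configuration before the keytab login happens.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(principal, keytabPath);
        return conf;
    }
}

Typical usage would be something like loginAndConfigure("someuser@EXAMPLE.COM", "/etc/security/keytabs/someuser.keytab") before creating the HConnection.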
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.msoa.hbase.client</groupId>
<artifactId>simpleHBase</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>HbaseWrite</name>
<url>http://maven.apache.org</url>
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>simpleHBase.actionClass</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
</plugins>
</build>
<!-- added for dev box -->
<repositories>
<repository>
<id>repo.hortonworks.com</id>
<name>Hortonworks HDP Maven Repository</name>
<url>http://repo.hortonworks.com/content/repositories/releases/</url>
</repository>
</repositories>
<!-- end dev box -->
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>jdk.tools</groupId>
<artifactId>jdk.tools</artifactId>
<scope>system</scope>
<version>1.7.0_60</version>
<systemPath>C:\Program Files\Java\jdk1.7.0_60\lib\tools.jar</systemPath>
</dependency>
<!-- adding to test on beam -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.2.0</version>
</dependency>
<!-- add protocol for beam test-->
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-protocol</artifactId>
<version>0.98.0-hadoop2</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>0.98.0-hadoop2</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-common</artifactId>
<version>0.98.0-hadoop2</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-protocol</artifactId>
<version>0.98.0-hadoop2</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-server</artifactId>
<version>0.98.0-hadoop2</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>4.2.3.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>4.2.3.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>4.2.3.RELEASE</version>
</dependency>
</dependencies>
Error:
java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:416)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:309)
    at simpleHBase.HBaseConnectionFactory.<init>(HBaseConnectionFactory.java:99)
    at simpleHBase.HBaseClient.<init>(HBaseClient.java:26)
    at simpleHBase.actionClass.main(actionClass.java:118)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
    ... 4 more
Caused by: java.lang.ExceptionInInitializerError
    at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
    at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:69)
    at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
    ... 9 more
Caused by: java.lang.RuntimeException: Failed to create local dir /data0/hadoop/hbase/local/jars, DynamicClassLoader failed to init
    at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:94)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:201)
    ... 14 more
Answer (score: 1)
At the end of the exception there is a line like:
"Caused by: java.lang.RuntimeException: Failed to create local dir /data0/hadoop/hbase/local/jars"
Can you check the permissions, i.e. whether the user running the client has permission to create a directory at that location?
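If fixing the permissions on /data0/hadoop/hbase/local is not an option on the client machine, another thing worth trying (my own assumption, not something confirmed above) is to point the client's local jar-cache directory at a path the user can write to. As far as I can tell, the DynamicClassLoader in HBase 0.98 derives that path from the hbase.local.dir property, so an override next to the other conf.set calls might look like this:

// Hypothetical workaround: redirect HBase's local jar cache to a writable path.
// "/tmp/hbase-local" is only an example location, adjust to your environment.
this.conf.set("hbase.local.dir", "/tmp/hbase-local");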