I want to connect to HBase in order to retrieve data from it, but I am running into a problem. Here is the code I wrote:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;

import com.google.protobuf.ServiceException;

public class test {
    public static void main(String[] args) throws ServiceException, IOException {
        System.out.println("Trying to connect...");
        Configuration configuration = HBaseConfiguration.create();
        configuration.addResource(new Path("hbase-site.xml"));
        System.out.println("HBase is running!");
        // opening the existing table
        HTable table = new HTable(configuration, "customerLocations");
        System.out.println("Table obtained ..");
        // Manipulating table ...
    }
}
I copied the "hbase-site.xml" file from the server and added it to the classpath. Here is the content of this file:
<property>
<name>dfs.domain.socket.path</name>
<value>/var/lib/hadoop-hdfs/dn_socket</value>
</property>
<property>
<name>hbase.bulkload.staging.dir</name>
<value>/apps/hbase/staging</value>
</property>
<property>
<name>hbase.client.keyvalue.maxsize</name>
<value>1048576</value>
</property>
<property>
<name>hbase.client.retries.number</name>
<value>35</value>
</property>
<property>
<name>hbase.client.scanner.caching</name>
<value>100</value>
</property>
<property>
<name>hbase.cluster.distributed</name>
<value>true</value>
</property>
<property>
<name>hbase.coprocessor.master.classes</name>
<value></value>
</property>
<property>
<name>hbase.coprocessor.region.classes</name>
<value>org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint</value>
</property>
<property>
<name>hbase.defaults.for.version.skip</name>
<value>true</value>
</property>
<property>
<name>hbase.hregion.majorcompaction</name>
<value>604800000</value>
</property>
<property>
<name>hbase.hregion.majorcompaction.jitter</name>
<value>0.50</value>
</property>
<property>
<name>hbase.hregion.max.filesize</name>
<value>10737418240</value>
</property>
<property>
<name>hbase.hregion.memstore.block.multiplier</name>
<value>4</value>
</property>
<property>
<name>hbase.hregion.memstore.flush.size</name>
<value>134217728</value>
</property>
<property>
<name>hbase.hregion.memstore.mslab.enabled</name>
<value>true</value>
</property>
<property>
<name>hbase.hstore.blockingStoreFiles</name>
<value>10</value>
</property>
<property>
<name>hbase.hstore.compaction.max</name>
<value>10</value>
</property>
<property>
<name>hbase.hstore.compactionThreshold</name>
<value>3</value>
</property>
<property>
<name>hbase.local.dir</name>
<value>${hbase.tmp.dir}/local</value>
</property>
<property>
<name>hbase.master.info.bindAddress</name>
<value>0.0.0.0</value>
</property>
<property>
<name>hbase.master.info.port</name>
<value>16010</value>
</property>
<property>
<name>hbase.master.namespace.init.timeout</name>
<value>2400000</value>
</property>
<property>
<name>hbase.master.port</name>
<value>16000</value>
</property>
<property>
<name>hbase.master.ui.readonly</name>
<value>false</value>
</property>
<property>
<name>hbase.master.wait.on.regionservers.timeout</name>
<value>30000</value>
</property>
<property>
<name>hbase.regionserver.executor.openregion.threads</name>
<value>20</value>
</property>
<property>
<name>hbase.regionserver.global.memstore.size</name>
<value>0.4</value>
</property>
<property>
<name>hbase.regionserver.handler.count</name>
<value>30</value>
</property>
<property>
<name>hbase.regionserver.info.port</name>
<value>16030</value>
</property>
<property>
<name>hbase.regionserver.port</name>
<value>16020</value>
</property>
<property>
<name>hbase.regionserver.wal.codec</name>
<value>org.apache.hadoop.hbase.regionserver.wal.WALCellCodec</value>
</property>
<property>
<name>hbase.rootdir</name>
<value>hdfs://189.168.12.6:8020/apps/hbase/data</value>
</property>
<property>
<name>hbase.rpc.protection</name>
<value>authentication</value>
</property>
<property>
<name>hbase.rpc.timeout</name>
<value>90000</value>
</property>
<property>
<name>hbase.security.authentication</name>
<value>simple</value>
</property>
<property>
<name>hbase.security.authorization</name>
<value>false</value>
</property>
<property>
<name>hbase.superuser</name>
<value>hbase</value>
</property>
<property>
<name>hbase.tmp.dir</name>
<value>/tmp/hbase-${user.name}</value>
</property>
<property>
<name>hbase.zookeeper.property.clientPort</name>
<value>2181</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<value>189.168.12.6</value>
</property>
<property>
<name>hbase.zookeeper.useMulti</name>
<value>true</value>
</property>
<property>
<name>hfile.block.cache.size</name>
<value>0.4</value>
</property>
<property>
<name>phoenix.query.timeoutMs</name>
<value>600</value>
</property>
<property>
<name>zookeeper.recovery.retry</name>
<value>6</value>
</property>
<property>
<name>zookeeper.session.timeout</name>
<value>90000</value>
</property>
<property>
<name>zookeeper.znode.parent</name>
<value>/hbase-unsecure</value>
</property>
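Since Configuration.addResource(new Path("hbase-site.xml")) resolves a relative path against the working directory rather than the classpath (as far as I understand), I also verified with a small standalone check that the file is actually visible on the classpath (the class name is my own):

```java
import java.net.URL;

public class ClasspathCheck {
    // Returns true if the named resource can be found on the classpath.
    static boolean onClasspath(String name) {
        URL url = ClasspathCheck.class.getClassLoader().getResource(name);
        return url != null;
    }

    public static void main(String[] args) {
        System.out.println("hbase-site.xml on classpath: " + onClasspath("hbase-site.xml"));
    }
}
```

In my case this printed true, so the file itself seems to be found.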
But it looks as if the server is not responding: the program blocks on the line

HTable table = new HTable(configuration, "customerLocations");

and never reaches the next one. Here is what the terminal shows when I run the code:
Trying to connect...
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
HBase is running!
17/08/10 09:17:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:host.name=vds004.insightscale.tn
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:java.version=1.8.0_141
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.141-1.b16.el7_3.x86_64/jre
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:java.class.path=jar_files/activation-1.1.jar:jar_files/apacheds-i18n-2.0.0-M15.jar:jar_files/apacheds-kerberos-codec-2.0.0-M15.jar:jar_files/api-asn1-api-1.0.0-M20.jar:jar_files/api-util-1.0.0-M20.jar:jar_files/avro-1.7.4.jar:jar_files/commons-beanutils-1.7.0.jar:jar_files/commons-beanutils-core-1.8.0.jar:jar_files/commons-cli-1.2.jar:jar_files/commons-codec-1.4.jar:jar_files/commons-collections-3.2.2.jar:jar_files/commons-compress-1.4.1.jar:jar_files/commons-configuration-1.6.jar:jar_files/commons-digester-1.8.jar:jar_files/commons-httpclient-3.1.jar:jar_files/commons-io-2.4.jar:jar_files/commons-lang-2.6.jar:jar_files/commons-logging-1.1.3.jar:jar_files/commons-math3-3.1.1.jar:jar_files/commons-net-3.1.jar:jar_files/curator-client-2.7.1.jar:jar_files/curator-framework-2.7.1.jar:jar_files/curator-recipes-2.7.1.jar:jar_files/gson-2.2.4.jar:jar_files/guava-11.0.2.jar:jar_files/hadoop-annotations-2.7.3.jar:jar_files/hadoop-auth-2.7.3.jar:jar_files/hadoop-client-2.7.3.jar:jar_files/hadoop-common-2.7.3.jar:jar_files/hadoop-hdfs-2.7.3.jar:jar_files/hadoop-mapreduce-client-app-2.7.3.jar:jar_files/hadoop-mapreduce-client-common-2.7.3.jar:jar_files/hadoop-mapreduce-client-core-2.7.3.jar:jar_files/hadoop-mapreduce-client-jobclient-2.7.3.jar:jar_files/hadoop-mapreduce-client-shuffle-2.7.3.jar:jar_files/hadoop-yarn-api-2.7.3.jar:jar_files/hadoop-yarn-client-2.7.3.jar:jar_files/hadoop-yarn-common-2.7.3.jar:jar_files/hadoop-yarn-server-common-2.7.3.jar:jar_files/htrace-core-3.1.0-incubating.jar:jar_files/httpclient-4.2.5.jar:jar_files/httpcore-4.2.4.jar:jar_files/jackson-core-asl-1.9.13.jar:jar_files/jackson-jaxrs-1.9.13.jar:jar_files/jackson-mapper-asl-1.9.13.jar:jar_files/jackson-xc-1.9.13.jar:jar_files/jaxb-api-2.2.2.jar:jar_files/jersey-client-1.9.jar:jar_files/jersey-core-1.9.jar:jar_files/jetty-util-6.1.26.jar:jar_files/jsp-api-2.1.jar:jar_files/jsr305-3.0.0.jar:jar_files/leveldbjni-all-1.8.jar:jar_files/log4j-
1.2.17.jar:jar_files/netty-3.6.2.Final.jar:jar_files/netty-all-4.0.23.Final.jar:jar_files/paranamer-2.3.jar:jar_files/protobuf-java-2.5.0.jar:jar_files/servlet-api-2.5.jar:jar_files/slf4j-api-1.7.10.jar:jar_files/slf4j-log4j12-1.7.10.jar:jar_files/snappy-java-1.0.4.1.jar:jar_files/stax-api-1.0-2.jar:jar_files/xercesImpl-2.9.1.jar:jar_files/xml-apis-1.3.04.jar:jar_files/xmlenc-0.52.jar:jar_files/xz-1.0.jar:jar_files/zookeeper-3.4.6.jar:apache-logging-log4j.jar:hadoop-0.20.1-dev-core.jar:hbase-0.92.1.jar:org-apache-commons-logging.jar:zookeeper.jar:
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:os.version=3.10.0-514.26.2.el7.x86_64
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:user.name=boussama
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/boussama
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/boussama/Desktop/hbase2/test
17/08/10 09:17:51 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=189.168.12.6:2181 sessionTimeout=90000 watcher=hconnection
17/08/10 09:17:51 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 5866@vds004.insightscale.tn
17/08/10 09:17:51 INFO zookeeper.ClientCnxn: Opening socket connection to server 189.168.12.6/189.168.12.6:2181. Will not attempt to authenticate using SASL (unknown error)
17/08/10 09:17:51 INFO zookeeper.ClientCnxn: Socket connection established to 189.168.12.6/189.168.12.6:2181, initiating session
17/08/10 09:17:51 INFO zookeeper.ClientCnxn: Session establishment complete on server 189.168.12.6/189.168.12.6:2181, sessionid = 0x15db4c4c7013188, negotiated timeout = 60000
The program blocks there and neither exits nor throws any exception. I cannot identify the source of the problem, since no exception is raised on my side.
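Since the ZooKeeper session is established but the client then hangs, I suspected it might be failing to reach the HBase master or a region server. To narrow that down I ran a quick TCP probe against the ports from the config above (a standalone sketch; the host and ports are copied from my hbase-site.xml):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    // Attempts a plain TCP connect with a timeout; true means something accepted the connection.
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Host and ports (ZooKeeper, hbase.master.port, hbase.regionserver.port) from the config above.
        String host = "189.168.12.6";
        for (int port : new int[] {2181, 16000, 16020}) {
            System.out.println(host + ":" + port + " reachable = " + isReachable(host, port, 1000));
        }
    }
}
```

Is there anything else I should check to find out why the client hangs at this point?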