Spark HBaseTest not working

Asked: 2016-01-19 09:24:24

Tags: apache-spark hbase

Trying to get HBaseTest working on a Spark setup in cluster mode with Kerberos auth -

$SPARK_HOME/bin/spark-submit \
--master yarn-cluster \
--class org.apache.spark.examples.HBaseTest \
--jars /apache/hbase/lib/*.jars \
--driver-class-path /apache/hbase/conf \
$SPARK_HOME/lib/spark-examples.jar \
myhbasetable

I end up with -

User class threw exception: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
Tue Jan 19 01:00:22 GMT-07:00 2016, org.apache.hadoop.hbase.client.RpcRetryingCaller@6515a0f9, java.io.IOException: Failed to find location, tableName=hbase:meta, row=myhbasetable,,00000000000000, reload=false
Tue Jan 19 01:00:50 GMT-07:00 2016, org.apache.hadoop.hbase.client.RpcRetryingCaller@6515a0f9, java.io.IOException: Enable/Disable failed
Tue Jan 19 01:01:16 GMT-07:00 2016, org.apache.hadoop.hbase.client.RpcRetryingCaller@6515a0f9, java.io.IOException: Enable/Disable failed
Tue Jan 19 01:01:43 GMT-07:00 2016, org.apache.hadoop.hbase.client.RpcRetryingCaller@6515a0f9, java.io.IOException: Enable/Disable failed
Tue Jan 19 01:02:09 GMT-07:00 2016, org.apache.hadoop.hbase.client.RpcRetryingCaller@6515a0f9, java.io.IOException: Enable/Disable failed
Tue Jan 19 01:02:37 GMT-07:00 2016, org.apache.hadoop.hbase.client.RpcRetryingCaller@6515a0f9, java.io.IOException: Enable/Disable failed

EDIT:

I don't have the master IP in hbase-site.xml; how can I find it? I have a list of IPs. I can run this with spark-shell but not with spark-submit. I get the error below, which I suspect is related to authentication -

16/01/19 18:45:39 ERROR yarn.ApplicationMaster: User class threw exception: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.io.IOException: Call to 10.115.201.64:60000 failed on local exception: java.io.IOException: Call id=4, waitTime=38 
org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.io.IOException: Call to 10.115.201.64:60000 failed on local exception: java.io.IOException: Call id=4, waitTime=38 
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1607) 
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
at ...

Any suggestions?

1 answer:

Answer 0 (score: 0)

Can you share more about how you are trying to connect? I'm attaching an example here that I hope is useful:

import org.apache.hadoop.hbase.HBaseConfiguration

val hbaseOutConf = HBaseConfiguration.create()
hbaseOutConf.set("hbase.zookeeper.quorum", "list of ip's")
hbaseOutConf.set("hbase.zookeeper.property.clientPort", "2181")
hbaseOutConf.set("hbase.master", "masterIP:60000")
hbaseOutConf.set("hadoop.security.authentication", "kerberos")
hbaseOutConf.set("hbase.security.authentication", "kerberos")

Log in from the keytab using UserGroupInformation:

import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.security.UserGroupInformation

UserGroupInformation.setConfiguration(hbaseOutConf)
UserGroupInformation.loginUserFromKeytab("user@---", keyTabPath)

HBaseAdmin.checkHBaseAvailable(hbaseOutConf)
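
If the availability check passes, the same configuration can be used to read the table the way HBaseTest does. This is only a rough sketch, not part of the original answer: the app name, the table name from the question, and the use of newAPIHadoopRDD with TableInputFormat (mirroring the stock Spark example) are my assumptions.

import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

// Reuse hbaseOutConf from above and point it at the table from the question
hbaseOutConf.set(TableInputFormat.INPUT_TABLE, "myhbasetable")

val sc = new SparkContext(new SparkConf().setAppName("HBaseReadSketch"))

// Scan the table as an RDD of (row key, Result) pairs, as the HBaseTest example does
val hBaseRDD = sc.newAPIHadoopRDD(
  hbaseOutConf,
  classOf[TableInputFormat],
  classOf[ImmutableBytesWritable],
  classOf[Result])

println(hBaseRDD.count())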

If you are in a Cloudera environment, check out this project:

http://blog.cloudera.com/blog/2015/08/apache-spark-comes-to-apache-hbase-with-hbase-spark-module/
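
The idea behind that module is an HBaseContext that ships the HBase configuration (including the Kerberos login) to the executors. Below is a rough sketch of typical usage; the class and method names come from the hbase-spark module described in the post and may differ by version, and it reuses sc and hbaseOutConf from the snippets above.

import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.spark.HBaseContext

// Wrap the SparkContext and the Kerberos-aware HBase configuration
val hbaseContext = new HBaseContext(sc, hbaseOutConf)

// Scan the table through the module instead of TableInputFormat
val scan = new Scan()
scan.setCaching(100)
val rdd = hbaseContext.hbaseRDD(TableName.valueOf("myhbasetable"), scan)
println(rdd.count())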