Security issue when writing data from Spark to an HBase table using Java

Date: 2017-02-24 14:40:15

Tags: apache-spark hbase spark-streaming kerberos

I am consuming messages from Kafka with Spark Streaming and, inside the map function, writing a test row to a Kerberos-secured HBase table:

JavaDStream<FreshInput> inputDStream = kafkaInputDStream
    .map(new Function<Tuple2<String, FreshInput>, FreshInput>() {

        private static final long serialVersionUID = 1L;

        @Override
        public FreshInput call(Tuple2<String, FreshInput> tuple) throws Exception {

            if (tuple._2 != null) {
                FreshLogger.errorLog_and_Console("***************************************************");
                FreshLogger.errorLog_and_Console("INSIDE FRESH INPUT CALL");
                FreshLogger.errorLog_and_Console(tuple._1.toString());
                FreshLogger.errorLog_and_Console(tuple._2.toString());
                FreshLogger.errorLog_and_Console("***************************************************");

                // Build the HBase configuration on the executor
                Configuration hconf = HBaseConfiguration.create();
                hconf.addResource(new Path("/etc/hbase/conf/core-site.xml"));
                hconf.addResource(new Path("/etc/hbase/conf/hbase-site.xml"));

                UserGroupInformation.setConfiguration(hconf);

                FreshLogger.errorLog_and_Console("hconf is " + hconf.toString());

                //UserGroupInformation.loginUserFromKeytabAndReturnUGI("gffshnee", "/etc/krb5.keytab");

                Connection connection = ConnectionFactory.createConnection(hconf);
                Table table = connection.getTable(TableName.valueOf("gfttsdgn:FRESHHBaseRushi"));

                Put p = new Put(Bytes.toBytes("row1"));
                p.add(Bytes.toBytes("c1"), Bytes.toBytes("output"), Bytes.toBytes("rushi"));
                table.put(p);
            }
            return tuple._2();
        }
    });
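
For reference, this is roughly what I intend the explicit keytab login to look like if I enable the commented-out line and run the put under the returned UGI. This is only a sketch: the principal "gffshnee" and keytab path "/etc/krb5.keytab" are copied from the commented line, and whether the principal needs full realm qualification (user@REALM) depends on the cluster.

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.security.UserGroupInformation;

public class SecureHBasePut {

    // Sketch: log in from the keytab on the executor, then run the put
    // inside ugi.doAs so the HBase RPC layer picks up the Kerberos
    // credentials instead of falling back to SIMPLE auth.
    public static void putTestRow(final Configuration hconf) throws Exception {
        UserGroupInformation.setConfiguration(hconf);
        final UserGroupInformation ugi = UserGroupInformation
                .loginUserFromKeytabAndReturnUGI("gffshnee", "/etc/krb5.keytab");
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            @Override
            public Void run() throws Exception {
                // try-with-resources closes the connection and table,
                // which the snippet above currently never does
                try (Connection connection = ConnectionFactory.createConnection(hconf);
                     Table table = connection.getTable(
                             TableName.valueOf("gfttsdgn:FRESHHBaseRushi"))) {
                    Put p = new Put(Bytes.toBytes("row1"));
                    p.add(Bytes.toBytes("c1"), Bytes.toBytes("output"),
                            Bytes.toBytes("rushi"));
                    table.put(p);
                }
                return null;
            }
        });
    }
}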

With the keytab login commented out as above, I get the following exception when the put is executed:

2017-02-24 09:09:33 WARN -FRESH Executor task launch worker-0 -UserGroupInformation : PriviledgedActionException as:gffshnee (auth:SIMPLE) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2017-02-24 09:09:33 WARN -FRESH Executor task launch worker-0 -RpcClientImpl$Connection$1 : Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2017-02-24 09:09:33 ERROR-FRESH Executor task launch worker-0 -RpcClientImpl$Connection$1 : SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
    at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:181)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1783)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1589)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:395)
    at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:344)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:238)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:190)
    at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1495)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1086)
    at com.citi.fresh.freshcore.util.ProcessRDD$1.call(ProcessRDD.java:198)
    at com.citi.fresh.freshcore.util.ProcessRDD$1.call(ProcessRDD.java:160)
    at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1015)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.convert.Wrappers$IteratorWrapper.next(Wrappers.scala:30)
    at com.citi.fresh.freshcore.util.ProcessRDD$3$1.call(ProcessRDD.java:361)
    at com.citi.fresh.freshcore.util.ProcessRDD$3$1.call(ProcessRDD.java:352)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1.apply(JavaRDDLike.scala:225)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1.apply(JavaRDDLike.scala:225)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1869)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1869)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
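
One thing I notice: the first WARN line says "as:gffshnee (auth:SIMPLE)", which suggests the executor JVM has no Kerberos login at all (a TGT obtained with kinit on the driver host does not travel to the executor JVMs). As a quick diagnostic, something like the following could be logged just before creating the connection (sketch only, using the same FreshLogger helper as in the code above):

// Log the executor's current UGI state; SIMPLE together with
// hasKerberosCredentials=false would confirm that no Kerberos
// login has happened in this JVM.
UserGroupInformation current = UserGroupInformation.getCurrentUser();
FreshLogger.errorLog_and_Console("auth method: " + current.getAuthenticationMethod()
        + ", hasKerberosCredentials: " + current.hasKerberosCredentials());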

Please advise.

0 Answers:

No answers yet