"Delegation Token can be issued only with kerberos or web authentication" in an Oozie Spark action

Asked: 2017-11-24 14:01:02

Tags: hadoop apache-spark yarn kerberos oozie

I have a cluster with Hadoop (2.7.3), HBase (1.2), ZooKeeper (3.4.8), Phoenix (apache-phoenix-4.10.0), Spark (2.2.0) and Oozie (4.3.0). All components are configured with Kerberos (except Spark).

When I try to run an Oozie Spark action on the kerberized Hadoop cluster, I get this error in the YARN logs (stdout):

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Delegation Token can be issued only with kerberos or web authentication
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDelegationToken(FSNamesystem.java:6642)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDelegationToken(NameNodeRpcServer.java:564)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDelegationToken(ClientNamenodeProtocolServerSideTranslatorPB.java:987)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

My workflow is as follows:

<credentials>
         <credential name='hb' type='hbase'>
                <property>
                        <name>hadoop.security.authentication</name>
                        <value>kerberos</value>
                </property>
                <property>
                        <name>hbase.security.authentication</name>
                        <value>kerberos</value>
                </property>
                <property>
                        <name>hbase.master.kerberos.principal</name>
                        <value>hbase/host@REALM</value>
                </property>
                <property>
                        <name>hbase.regionserver.kerberos.principal</name>
                        <value>hbase/host@REALM</value>
                </property>
                <property>
                        <name>hbase.zookeeper.quorum</name>
                        <value>domain</value>
                </property>
                <property>
                        <name>hadoop.rpc.protection</name>
                        <value>authentication</value>
                </property>
                <property>
                        <name>hbase.rpc.protection</name>
                        <value>authentication</value>
                </property>
                <property>
                        <name>hbase.master.keytab.file</name>
                        <value>/etc/security/keytabs/hbase.keytab</value>
                </property>
                <property>
                        <name>hbase.regionserver.keytab.file</name>
                        <value>/etc/security/keytabs/hbase.keytab</value>
                </property>
         </credential>
        </credentials>

<action name="action_name" cred="hb">
    <spark xmlns="uri:oozie:spark-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>yarn</master>
            <mode>cluster</mode>
            <name>spark action name</name>
            <class>com.sparkclass</class>
            <jar>spark_code.jar</jar>
            <spark-opts>--queue jobs --files hbase-site.xml</spark-opts>
            <arg>-D</arg>
            <arg>hbase.zookeeper.quorum=${zkQuorum}</arg>
            <arg>-D</arg>
            <arg>hbase.zookeeper.client.port=${zkClientPort}</arg>
            <arg>${date}</arg>
            <arg>${arg2}</arg>
            <arg>${arg3}</arg>
            <file>${nameNode}/user/${wf:user()}/jobs/lib/hbase-site.xml</file>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
</action>
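A common cause of this error (an assumption based on the stack trace, not verified on this cluster) is that Oozie already ships HDFS delegation tokens to the launcher job, so the launcher is authenticated by token rather than by a Kerberos TGT; when Spark then tries to obtain fresh delegation tokens itself, the NameNode rejects the request. Spark 2.2 on YARN exposes `spark.yarn.security.credentials.<service>.enabled` switches to turn off its built-in credential providers, so a possible sketch of the action's `<spark-opts>` would be:

```xml
<!-- Hypothetical workaround: let Oozie's tokens (and the "hb" credential
     above) be used as-is, and stop Spark from fetching its own tokens. -->
<spark-opts>--queue jobs --files hbase-site.xml
    --conf spark.yarn.security.credentials.hadoopfs.enabled=false
    --conf spark.yarn.security.credentials.hive.enabled=false
    --conf spark.yarn.security.credentials.hbase.enabled=false</spark-opts>
```

The HBase token would still come from the Oozie `cred="hb"` credential declared in the workflow; the flags only suppress Spark's own token acquisition at submit time.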

I have a Java action and a Sqoop action that work perfectly; only the Spark action gives me this problem.

I tried switching to a Java action and a shell action without result. Maybe I need to change my code?

Thanks

0 answers:

There are no answers yet.