Oozie Spark action fails in a Kerberos environment

Time: 2018-03-18 08:03:39

Tags: apache-spark hive kerberos oozie hive-metastore

I am running a Spark job through the Oozie Spark action. The Spark job uses HiveContext to run some queries. The cluster is configured with Kerberos. When I submit the job from the console with spark-submit, it runs successfully, but when I run the same job from Oozie it fails with the following error.

    18/03/18 03:34:16 INFO metastore: Trying to connect to metastore with URI thrift://localhost.local:9083
    18/03/18 03:34:16 ERROR TSaslTransport: SASL negotiation failure
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
            at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
            at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
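
For comparison, the spark-submit that succeeds from the console looks roughly like the sketch below. Only the class name comes from the workflow; the paths, resource settings and the kinit step are placeholders and reflect my assumption that a Kerberos ticket was obtained before submitting.

# a valid Kerberos TGT is assumed, obtained beforehand, e.g.:
# kinit -kt /path/to/user.keytab user@DOMAIN.COM
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.demo.analyzer \
  --files config.properties,hive-site.xml \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 4g \
  --driver-memory 2g \
  analyzer.jar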

workflow.xml

<workflow-app xmlns="uri:oozie:workflow:0.5" name="workflow">
   <start to="Analysis" />
   <action name="Analysis">
      <spark xmlns="uri:oozie:spark-action:0.1">
         <job-tracker>${jobTracker}</job-tracker>
         <name-node>${nameNode}</name-node>
         <master>${master}</master>
         <name>Analysis</name>
         <class>com.demo.analyzer</class>
         <jar>${appLib}</jar>
         <spark-opts>--jars ${sparkLib} --files ${config},${hivesite} --num-executors ${NoOfExecutors} --executor-cores ${ExecutorCores} --executor-memory ${ExecutorMemory} --driver-memory ${driverMemory}</spark-opts>
      </spark>
      <ok to="sendEmail" />
      <error to="fail" />
   </action>
   <action name="sendEmail">
      <email xmlns="uri:oozie:email-action:0.1">
         <to>${emailToAddress}</to>
         <subject>Output of workflow ${wf:id()}</subject>
         <body>Results from line count: ${wf:actionData('shellAction')['NumberOfLines']}</body>
      </email>
      <ok to="end" />
      <error to="end" />
   </action>
   <kill name="fail">
      <message>Bash action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
   </kill>
   <end name="end" />
</workflow-app>

Do I need to configure anything Kerberos-related in workflow.xml? What am I missing here?

Any help is appreciated.

Thanks in advance.

2 Answers:

Answer 0 (score: 4):

You need to add an hcat credential for the thrift URI in your Oozie workflow. This authenticates against the metastore at the thrift URI using Kerberos.

Add the following credentials block to your Oozie workflow:

<credentials>
    <credential name="credhive" type="hcat">
        <property>
            <name>hcat.metastore.uri</name>
            <value>${thrift_uri}</value>
        </property>
        <property>
            <name>hcat.metastore.principal</name>
            <value>${principal}</value>
        </property>
    </credential>
</credentials>
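
For reference, in workflow schema 0.5 the credentials element is a direct child of workflow-app and is declared before the start node, roughly as in this structural sketch (element bodies omitted):

<workflow-app xmlns="uri:oozie:workflow:0.5" name="workflow">
   <credentials>
      <!-- the credhive credential shown above -->
   </credentials>
   <start to="Analysis" />
   <!-- actions, kill and end nodes as before -->
</workflow-app>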

Then reference the credential in the Spark action, as shown below:

<action name="Analysis" cred="credhive">
      <spark xmlns="uri:oozie:spark-action:0.1">
         <job-tracker>${jobTracker}</job-tracker>
         <name-node>${nameNode}</name-node>
         <master>${master}</master>
         <name>Analysis</name>
         <class>com.demo.analyzer</class>
         <jar>${appLib}</jar>
         <spark-opts>--jars ${sparkLib} --files ${config},${hivesite} --num-executors ${NoOfExecutors} --executor-cores ${ExecutorCores} --executor-memory ${ExecutorMemory} --driver-memory ${driverMemory}</spark-opts>
      </spark>
      <ok to="sendEmail" />
      <error to="fail" />
   </action>

Both the thrift_uri and the principal can be found in hive-site.xml. The thrift_uri is set in this hive-site.xml property:

<property>
    <name>hive.metastore.uris</name>
    <value>thrift://xxxxxx:9083</value>
  </property>

Also, the principal is set in this hive-site.xml property:

 <property>
    <name>hive.metastore.kerberos.principal</name>
    <value>hive/_HOST@domain.COM</value>
  </property>
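
The ${thrift_uri} and ${principal} variables are then passed in as workflow parameters, for example through job.properties. A sketch, using the values from the hive-site.xml snippets above (replace them with the ones from your cluster):

thrift_uri=thrift://xxxxxx:9083
principal=hive/_HOST@domain.COM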

Answer 1 (score: 0):

Upload the keytab to the server, then pass the keytab file and the principal as arguments in the workflow's spark-opts. Let me know whether it works. Thanks.

<spark-opts>--keytab nagendra.keytab --principal "nagendra@domain.com" --jars ${sparkLib} --files ${config},${hivesite} --num-executors ${NoOfExecutors} --executor-cores ${ExecutorCores} --executor-memory ${ExecutorMemory} --driver-memory ${driverMemory}</spark-opts>
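
Note that --keytab nagendra.keytab refers to a file in the action's working directory, so the keytab usually also has to be shipped with the action, for example via a <file> element. A sketch under that assumption; the HDFS path is a placeholder:

<spark xmlns="uri:oozie:spark-action:0.1">
   <!-- job-tracker, name-node, master, name, class and jar as in the workflow above -->
   <spark-opts>--keytab nagendra.keytab --principal nagendra@domain.com --jars ${sparkLib} --files ${config},${hivesite}</spark-opts>
   <!-- make the keytab available in the container's working directory -->
   <file>${nameNode}/user/nagendra/nagendra.keytab#nagendra.keytab</file>
</spark>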