How do I fix this Kerberos error? "KrbException: KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE"

Date: 2018-01-17 05:23:59

Tags: hadoop hdfs kerberos

I am hitting a very strange error on a Kerberos-secured Hadoop cluster. When I list an HDFS directory directly from the command line (CLI), I get exactly the results I expect:

$ hadoop fs -ls hdfs://ip-172-31-0-38.us-west-2.compute.internal:8020/user/datapass
Found 1 items
-rw-r--r--   3 datapass datapass         11 2018-01-14 23:36 hdfs://ip-172-31-0-38.us-west-2.compute.internal:8020/user/datapass/test.txt

But whenever I try to read it from an Apache Spark program, I get the following error:

java.io.IOException: java.lang.reflect.UndeclaredThrowableException
    at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:888)
    at org.apache.hadoop.crypto.key.KeyProviderDelegationTokenExtension.addDelegationTokens(KeyProviderDelegationTokenExtension.java:86)
    at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2234)
    at org.apache.spark.deploy.yarn.security.HadoopFSCredentialProvider$$anonfun$obtainCredentials$1.apply(HadoopFSCredentialProvider.scala:52)
    at org.apache.spark.deploy.yarn.security.HadoopFSCredentialProvider$$anonfun$obtainCredentials$1.apply(HadoopFSCredentialProvider.scala:49)
    at scala.collection.immutable.Set$Set1.foreach(Set.scala:94)
    at org.apache.spark.deploy.yarn.security.HadoopFSCredentialProvider.obtainCredentials(HadoopFSCredentialProvider.scala:49)
    at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:82)
    at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:80)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
    at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager.obtainCredentials(ConfigurableCredentialManager.scala:80)
    at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:389)
    at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:832)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:170)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:173)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:918)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:910)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:910)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.UndeclaredThrowableException
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1713)
    at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:870)
    ... 38 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: No valid credentials provided (Mechanism level: KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE)
    at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:332)
    at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:205)
    at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:131)
    at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
    at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.doDelegationTokenOperation(DelegationTokenAuthenticator.java:288)
    at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.getDelegationToken(DelegationTokenAuthenticator.java:169)
    at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.getDelegationToken(DelegationTokenAuthenticatedURL.java:373)
    at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:875)
    at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:870)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
    ... 39 more
Caused by: GSSException: No valid credentials provided (Mechanism level: KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE)
    at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:770)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:311)
    at org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:287)
    ... 50 more
Caused by: KrbException: KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE
    at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
    at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251)
    at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262)
    at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308)
    at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126)
    at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458)
    at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693)
    ... 57 more
Caused by: KrbException: Identifier doesn't match expected value (906)
    at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140)
    at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65)
    at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60)
    at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
    ... 63 more

This is strange because:

  1. The error is an UndeclaredThrowableException, yet no Java reflection seems to be involved.
  2. The detailed error, KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE, suggests the Kerberos .keytab uses an unapproved encryption type.
  3. I can run hadoop fs -ls with the same .keytab.
  4. Here is my krb5.conf file:

    #File modified by ipa-client-install
    
    includedir /etc/krb5.conf.d/
    # includedir /var/lib/sss/pubconf/krb5.include.d/
    
    [libdefaults]
      default_realm = DATAPASSPORT.INTERNAL
      dns_lookup_realm = false
      dns_lookup_kdc = false
      rdns = false
      dns_canonicalize_hostname = false
      ticket_lifetime = 15m
      forwardable = true
      udp_preference_limit = 0
    
      renew_lifetime = 20m
      default_tgs_enctypes = aes256-cts aes128-cts
      default_tkt_enctypes = aes256-cts aes128-cts
      permitted_enctypes = aes256-cts aes128-cts
    
    
    [realms]
      DATAPASSPORT.INTERNAL = {
        kdc = ip-172-31-11-134.us-west-2.compute.internal:88
        master_kdc = ip-172-31-11-134.us-west-2.compute.internal:88
        admin_server = ip-172-31-11-134.us-west-2.compute.internal:749
        kpasswd_server = ip-172-31-11-134.us-west-2.compute.internal:464
        default_domain = datapassport.internal
    #    pkinit_anchors = FILE:/var/lib/ipa-client/pki/kdc-ca-bundle.pem
    #    pkinit_pool = FILE:/var/lib/ipa-client/pki/ca-bundle.pem
    
      }
    
    [domain_realm]
      .datapassport.internal = DATAPASSPORT.INTERNAL
      datapassport.internal = DATAPASSPORT.INTERNAL
      ip-172-31-11-240.us-west-2.compute.internal = DATAPASSPORT.INTERNAL
      .us-west-2.compute.internal = DATAPASSPORT.INTERNAL
      us-west-2.compute.internal = DATAPASSPORT.INTERNAL
    

The configuration is 90% identical to that of a working cluster (except for the IPA-related parts). So why is this happening, and what should I do to fix it?
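One way to narrow this down is to compare the encryption types stored in the keytab with those the KDC actually issues. A diagnostic sketch (the keytab path and principal below are assumptions -- substitute your own):

```shell
# Show the enctype of every key stored in the keytab
# (keytab path is an assumption -- substitute your own):
klist -e -k -t /etc/security/keytabs/datapass.keytab

# Get a fresh TGT from that keytab, then display the enctypes the KDC used:
kinit -kt /etc/security/keytabs/datapass.keytab datapass@DATAPASSPORT.INTERNAL
klist -e
```

If the keytab lists enctypes that are absent from `permitted_enctypes`, or that the KDC does not serve, that mismatch produces exactly the BAD_ENCRYPTION_TYPE error above.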

1 Answer:

Answer 0: (score: 0)

This issue can occur if the Java Cryptography Extension (JCE) unlimited-strength policy files are not present under the JAVA_HOME/jre/lib/security folder on all nodes, for all the services involved. Without them, the JVM caps AES at 128 bits, so tickets negotiated under aes256-cts (as required by the permitted_enctypes above) cannot be processed and the request fails with BAD_ENCRYPTION_TYPE.
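A quick way to verify this on each node is to look for the two policy jars in the JRE security directory. This is a sketch: it assumes JAVA_HOME points at the JDK the Hadoop services actually run on, with /usr/java/default as a fallback -- adjust both for your environment.

```shell
# Check for the JCE unlimited-strength policy jars in the JRE security dir
# (JAVA_HOME and the fallback path are assumptions -- adjust as needed).
JRE_SEC="${JAVA_HOME:-/usr/java/default}/jre/lib/security"
for jar in local_policy.jar US_export_policy.jar; do
    if [ -f "$JRE_SEC/$jar" ]; then
        echo "found:   $jar"
    else
        echo "MISSING: $jar"
    fi
done
```

If either jar is reported missing, install the JCE Unlimited Strength Jurisdiction Policy files matching your JDK version on every node and restart the affected services. (On JDK 8u161 and later, unlimited-strength cryptography is enabled by default and these jars are no longer needed.)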