Accessing Hadoop with Kerberos fails

Date: 2016-09-13 06:34:53

Tags: hadoop kerberos

I installed Hadoop and Kerberos, but when I execute hadoop fs -ls / the following error occurs.

[dannil@ozcluster06 logs]$ hadoop fs -ls /
16/09/13 11:34:39 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "localhost/127.0.0.1"; destination host is: "192.168.168.46":9000; 
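If more detail than this stack trace is needed, JVM-level Kerberos debugging can be enabled. A minimal sketch (sun.security.krb5.debug is a standard JDK system property, and HADOOP_OPTS is the usual way to pass JVM options to the hadoop CLI):

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /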

I can see from the jps output that the DataNode and NameNode have been started:

  20963 DataNode
  21413 SecondaryNameNode
  20474 NameNode
  22906 Jps

I added the principals hdfs/oz.flex@OZ.FLEX and HTTP/oz.flex@OZ.FLEX, then ran xst -norandkey -k hdfs.keytab hdfs/oz.flex@OZ.FLEX HTTP/oz.flex@OZ.FLEX to generate hdfs.keytab.
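For reference, the full kadmin.local session would look roughly like this (a sketch; it assumes the principals were created with addprinc -randkey, which the question does not show):

kadmin.local:  addprinc -randkey hdfs/oz.flex@OZ.FLEX
kadmin.local:  addprinc -randkey HTTP/oz.flex@OZ.FLEX
kadmin.local:  xst -norandkey -k hdfs.keytab hdfs/oz.flex@OZ.FLEX HTTP/oz.flex@OZ.FLEX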

kadmin.local:  listprincs
HTTP/oz.flex@OZ.FLEX
K/M@OZ.FLEX
dannil/admin@OZ.FLEX
hdfs/oz.flex@OZ.FLEX
kadmin/admin@OZ.FLEX
kadmin/changepw@OZ.FLEX
kadmin/ozcluster06@OZ.FLEX
kiprop/ozcluster06@OZ.FLEX
krbtgt/OZ.FLEX@OZ.FLEX

Then I executed kinit -kt /home/dannil/hadoop-2.7.1/hdfs.keytab hdfs/oz.flex.
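It is also worth confirming that the keytab actually contains both principals (klist -kt lists the entries stored in a keytab):

klist -kt /home/dannil/hadoop-2.7.1/hdfs.keytab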

I can see my ticket status with klist:

[dannil@ozcluster06 ~]$ klist
Ticket cache: KEYRING:persistent:1000:krb_ccache_4h73plA
Default principal: hdfs/oz.flex@OZ.FLEX

Valid starting       Expires              Service principal
2016-09-13T10:47:06  2016-09-14T10:47:06  krbtgt/OZ.FLEX@OZ.FLEX
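One detail stands out in this output: the ticket cache is of type KEYRING:persistent. Java's Kerberos implementation generally cannot read kernel-keyring credential caches, only file-based ones, so the JVM can fail to find the TGT even though klist shows it. A workaround sketch (the FILE: path is illustrative, not from the question; KRB5CCNAME must point at the same cache when the hadoop command runs):

export KRB5CCNAME=FILE:/tmp/krb5cc_hdfs
kinit -kt /home/dannil/hadoop-2.7.1/hdfs.keytab hdfs/oz.flex
hadoop fs -ls /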

Here are my Hadoop configuration values:

core-site.xml:

fs.defaultFS=hdfs://192.168.168.46:9000
hadoop.security.authentication=kerberos
hadoop.security.authorization=true
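In core-site.xml itself these appear as standard Hadoop property elements, e.g. (a sketch of the usual format):

<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>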

hdfs-site.xml:

dfs.replication=1
dfs.permissions=false
dfs.block.access.token.enable=true
dfs.namenode.keytab.file=/home/dannil/hadoop-2.7.1/hdfs.keytab
dfs.namenode.kerberos.principal=hdfs/oz.flex@OZ.FLEX
dfs.namenode.kerberos.internal.spnego.principal=HTTP/oz.flex@OZ.FLEX
dfs.secondary.namenode.keytab.file=/home/dannil/hadoop-2.7.1/hdfs.keytab
dfs.secondary.namenode.kerberos.principal=hdfs/oz.flex@OZ.FLEX
dfs.secondary.namenode.kerberos.internal.spnego.principal=HTTP/oz.flex@OZ.FLEX
dfs.datanode.data.dir.perm=700
dfs.datanode.address=0.0.0.0:61004
dfs.datanode.http.address=0.0.0.0:61006
dfs.datanode.keytab.file=/home/dannil/hadoop-2.7.1/hdfs.keytab
dfs.datanode.kerberos.principal=hdfs/oz.flex@OZ.FLEX
dfs.https.port=50470
dfs.https.address=0.0.0.0:50470
dfs.webhdfs.enabled=true
dfs.web.authentication.kerberos.principal=HTTP/oz.flex@OZ.FLEX
dfs.web.authentication.kerberos.keytab=/home/dannil/hadoop-2.7.1/hdfs.keytab
dfs.http.policy=HTTPS_ONLY
dfs.data.transfer.protection=integrity
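To verify what the client actually resolves from these files, individual keys can be queried directly (hdfs getconf -confKey is a standard command that prints a key's effective value from the local configuration):

hdfs getconf -confKey hadoop.security.authentication
hdfs getconf -confKey dfs.namenode.kerberos.principal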

Why did this error happen, and what can I do to fix it?

1 Answer:

Answer 0 (score: 0):

  1. Try executing the command as the 'hdfs' user and check whether it succeeds.
  2. If that does not work, please confirm whether Kerberos is enabled in the SSH daemon configuration file '/etc/ssh/sshd_config': check for the line "KerberosAuthentication yes" as well as the line "GSSAPIAuthentication yes". By default these are commented out, so uncomment them and set them to yes (a sketch follows below).
  3. Please report back here whether this works; it will help others.
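A sketch of the edit described in step 2 (the lines below go in /etc/ssh/sshd_config, uncommented; restarting sshd afterwards, e.g. with systemctl restart sshd on systemd systems, is assumed):

# /etc/ssh/sshd_config
KerberosAuthentication yes
GSSAPIAuthentication yes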