Can't Kafka Connect and the HDFS Sink Connector use different KDCs and realms?

Date: 2019-03-22 01:04:02

Tags: apache-kafka hdfs kerberos apache-kafka-connect confluent

I have set up Kafka Connect against a Kerberized Kafka cluster (say the KDC is "kafka-auth101.hadoop.local" and the realm is "KAFKA.MYCOMPANY.COM").

Now I am trying to set up the HDFS Sink to write to a Hadoop cluster that is Kerberized with a different KDC (say the KDC is "hadoop-auth101.hadoop.local" and the realm is "HADOOP.MYCOMPANY.COM").

I have added both realms to the krb5.conf used by Kafka Connect.

However, during initialization the HDFS Sink connector instance fails to start and gives the error below.

Any hints on this? Essentially, with this configuration a single JVM has to authenticate against two different KDCs and realms.
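For context, the Confluent HDFS Sink connector carries its own Kerberos settings, separate from the worker's Kafka credentials, so the two realms are configured in different places. A minimal sketch of the relevant connector properties, using the principal and keytab path from the error above (the NameNode host and principal are assumptions about your cluster):

```
# HDFS Sink connector Kerberos settings (sketch; NameNode values are assumed)
hdfs.url=hdfs://namenode.hadoop.local:8020
hdfs.authentication.kerberos=true
connect.hdfs.principal=hdfsuser@HADOOP.MYCOMPANY.COM
connect.hdfs.keytab=/etc/hadoop/keytab/stg.keytab
hdfs.namenode.principal=hdfs/_HOST@HADOOP.MYCOMPANY.COM
```

The worker's own Kafka SASL/GSSAPI settings stay pointed at KAFKA.MYCOMPANY.COM; only the connector-level properties above reference the Hadoop realm.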

>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): HADOOP.MYCOMPANY.COM
>>> KeyTabInputStream, readName(): hdfsuser
>>> KeyTab: load() entry length: 85; type: 18
Looking for keys for: hdfsuser@HADOOP.MYCOMPANY.COM
Found unsupported keytype (18) for hdfsuser@HADOOP.MYCOMPANY.COM
[2019-03-19 07:21:12,330] INFO Couldn't start HdfsSinkConnector:
(io.confluent.connect.hdfs.HdfsSinkTask)
org.apache.kafka.connect.errors.ConnectException: java.io.IOException: Login failure for hdfsuser@HADOOP.MYCOMPANY.COM from keytab /etc/hadoop/keytab/stg.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
	at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:202)
	at io.confluent.connect.hdfs.HdfsSinkTask.start(HdfsSinkTask.java:76)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:232)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:145)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Login failure for hdfsuser@HADOOP.MYCOMPANY.COM from keytab /etc/hadoop/keytab/stg.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:963)
	at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:127)
	... 10 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
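The "Found unsupported keytype (18)" line just above the exception may be the real clue: enctype 18 is aes256-cts-hmac-sha1-96, and on older JREs (before 8u161) AES-256 Kerberos keys require the JCE Unlimited Strength policy files. Without them, the AES-256 keytab entry is skipped and login fails with the generic "Unable to obtain password from user". A quick way to check whether the running JVM allows AES-256 (a generic JCE check, not specific to Kafka Connect):

```java
import javax.crypto.Cipher;

public class CheckAesStrength {
    public static void main(String[] args) throws Exception {
        // Integer.MAX_VALUE (or anything >= 256) means unlimited-strength
        // crypto is enabled, so AES-256 keytab entries (enctype 18) are usable.
        int max = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max allowed AES key length: " + max);
        if (max < 256) {
            System.out.println("AES-256 unavailable: install the JCE Unlimited Strength policy files");
        }
    }
}
```

If this reports less than 256 in the JVM that runs Kafka Connect, the HADOOP.MYCOMPANY.COM keytab cannot be decrypted regardless of the krb5.conf contents.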

The krb5.conf looks like this:

[logging]
   kdc = FILE:/var/log/krb5/krb5kdc.log
   admin_server = FILE:/var/log/krb5/kadmin.log
   default = FILE:/var/log/krb5/krb5libs.log

[libdefaults]
 default_realm = KAFKA.MYCOMPANY.COM
 dns_lookup_realm = false
 dns_lookup_kdc = false
 ticket_lifetime = 24h
 forwardable = yes
 allow_weak_crypto = true
 renew_lifetime = 7d
 kdc_timeout = 3000
 max_retries = 2
 clockskew = 120
 default_tkt_enctypes = rc4-hmac aes256-cts aes128-cts des3-cbc-sha1 des-cbc-md5 des-cbc-crc
 default_tgs_enctypes = rc4-hmac aes256-cts aes128-cts des3-cbc-sha1 des-cbc-md5 des-cbc-crc
 permitted_enctypes   = rc4-hmac aes256-cts aes128-cts des3-cbc-sha1 des-cbc-md5 des-cbc-crc



[realms]
 # KDC/realm for Kafka
 KAFKA.MYCOMPANY.COM = {
  kdc = kafka-auth101.hadoop.local
  admin_server = kafka-auth101.hadoop.local:2749
 }

 # KDC/realm for Hadoop/HDFS
 HADOOP.MYCOMPANY.COM = {
  kdc = hadoop-auth101.hadoop.local
  admin_server = hadoop-auth101.hadoop.local:2749
 }

[appdefaults]
 pam = {
   debug = false
   ticket_lifetime = 36000
   renew_lifetime = 36000
   forwardable = true
   krb4_convert = false
 }

0 Answers:

No answers yet.