HDFS INotify and Kerberos authentication in a Java client

Date: 2018-06-06 15:05:09

Tags: java hadoop hdfs kerberos inotify

I am using this example from the Internet: hdfs-inotify-example. The build completes without errors, but execution ends with this error:

Exception in thread "main" org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]

The Kerberos system is up and I have a fresh Kerberos ticket that works fine elsewhere, so I am not sure this is really a Kerberos problem. I have also set this environment variable:

export HADOOP_CONF_DIR=/etc/hadoop/conf  

which points to core-site.xml, where the security settings are, AFAIK, correct:

  <property>
    <name>hadoop.security.authentication</name>
    <value>kerberos</value>
  </property>
  <property>
    <name>hadoop.security.authorization</name>
    <value>true</value>
  </property>
  <property>
    <name>hadoop.rpc.protection</name>
    <value>authentication</value>
  </property>

What is going wrong? Any suggestion is (very) welcome.

I am using Hadoop 2.6.0-cdh5.10.1.
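
One detail worth checking first: a plain new Configuration() only reads core-site.xml when the configuration directory is on the classpath; setting HADOOP_CONF_DIR by itself affects the hadoop shell scripts but does not make a standalone Java client pick the file up. A minimal diagnostic sketch (the class name here is made up) to print what the client actually sees:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class AuthCheck {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Prints "simple" when core-site.xml is not on the classpath,
        // which matches the SIMPLE fallback in the exception above.
        System.out.println("hadoop.security.authentication = "
                + conf.get("hadoop.security.authentication", "simple"));
        UserGroupInformation.setConfiguration(conf);
        System.out.println("Security enabled: "
                + UserGroupInformation.isSecurityEnabled());
    }
}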

2 Answers:

Answer 0 (score: 0)

In the end, I found the solution to my problem:

Here is the patch I added to hdfs-inotify-example:

diff --git a/src/main/java/com/onefoursix/HdfsINotifyExample.java b/src/main/java/com/onefoursix/HdfsINotifyExample.java
index 97ac409..32321b1 100644
--- a/src/main/java/com/onefoursix/HdfsINotifyExample.java
+++ b/src/main/java/com/onefoursix/HdfsINotifyExample.java
@@ -11,6 +11,7 @@ import org.apache.hadoop.hdfs.inotify.Event.CreateEvent;
 import org.apache.hadoop.hdfs.inotify.Event.UnlinkEvent;
 import org.apache.hadoop.hdfs.inotify.EventBatch;
 import org.apache.hadoop.hdfs.inotify.MissingEventsException;
+import org.apache.hadoop.security.UserGroupInformation;

 public class HdfsINotifyExample {

@@ -21,10 +22,20 @@ public class HdfsINotifyExample {
                if (args.length > 1) {
                        lastReadTxid = Long.parseLong(args[1]);
                }
-
+        
+                System.out.println("com.onefoursix.HdfsINotifyExample.main()");
                System.out.println("lastReadTxid = " + lastReadTxid);
-
-               HdfsAdmin admin = new HdfsAdmin(URI.create(args[0]), new Configuration());
+                Configuration config = new Configuration();
+                
+                config.set("hadoop.security.authentication", "kerberos");
+                config.set("hadoop.security.authorization", "true");
+                config.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@AD.XXXXX.COM");
+                config.set("dfs.namenode.kerberos.principal.pattern", "hdfs/*@AD.XXXXX.COM");
+                
+                UserGroupInformation.setConfiguration(config);
+                System.out.println("Security enabled " + UserGroupInformation.isSecurityEnabled());
+                
+               HdfsAdmin admin = new HdfsAdmin(URI.create(args[0]), config);

                DFSInotifyEventInputStream eventStream = admin.getInotifyEventStream(lastReadTxid);

With this patch, authentication worked fine. In the end I got:

Exception in thread "main" org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Access denied for user xxxxxxxx. Superuser privilege is required

which tells me that the current user is not allowed to see what happens in the Datanode, but that is another story.
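
Since getInotifyEventStream requires superuser privilege, one option is to log in programmatically with the superuser's keytab instead of relying on the ticket cache. A hedged sketch, assuming a readable keytab for the hdfs principal (the principal name and keytab path below are hypothetical):

// Hypothetical principal and keytab path; adjust to your cluster.
UserGroupInformation.setConfiguration(config);
UserGroupInformation.loginUserFromKeytab(
        "hdfs/namenode.ad.xxxxx.com@AD.XXXXX.COM",
        "/etc/security/keytabs/hdfs.keytab");
HdfsAdmin admin = new HdfsAdmin(URI.create(args[0]), config);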

Answer 1 (score: -1)

Adding to ozw1z5rd's answer:

Log in as the superuser (hdfs) and execute the program there:

$ sudo -i -u hdfs
$ cp shaded-fat-jar.jar /home/hdfs

and run the program from the jar file you copied into the hdfs home directory.
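
For example (the NameNode URI below is a placeholder; the main class comes from the patched example above):

$ sudo -u hdfs java -cp /home/hdfs/shaded-fat-jar.jar com.onefoursix.HdfsINotifyExample hdfs://namenode.example.com:8020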