I am trying to list the contents of an HDFS folder from a remote Java client, as shown below:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.security.UserGroupInformation.AuthenticationMethod;

public class ListHdsfFiles {

    private static final String USER_NAME = "abcdefg";

    public static void main(final String[] args) throws IOException, InterruptedException {
        System.setProperty("HADOOP_USER_NAME", USER_NAME);
        System.setProperty("hadoop.home.dir", "C:\\HADOOP\\hadoop-2.6.0");

        final Configuration configuration = new Configuration();
        configuration.set("hadoop.proxyuser.superuser.hosts", "*");
        configuration.set("hadoop.security.authentication", "kerberos");
        configuration.set("hadoop.security.authorization", "true");
        configuration.set("hadoop.security.auth_to_local",
                "RULE:[1:$1@$0](.*@\\Q\\E$)s/@\\Q\\E$//"
                + "RULE:[2:$1@$0](.*@\\Q\\E$)s/@\\Q\\E$//"
                + "DEFAULT");
        configuration.set("fs.defaultFS", "hdfs://host:9000");

        UserGroupInformation.setConfiguration(configuration);
        final UserGroupInformation userGroupInformation =
                UserGroupInformation.createProxyUser(USER_NAME, UserGroupInformation.getCurrentUser());
        userGroupInformation.setAuthenticationMethod(AuthenticationMethod.KERBEROS);

        userGroupInformation.doAs(new PrivilegedExceptionAction<UserGroupInformation>() {
            @Override
            public UserGroupInformation run() throws Exception {
                System.out.println("here ::::: " + UserGroupInformation.getCurrentUser());

                final Configuration configurationInner = new Configuration();
                configurationInner.set("hadoop.security.auth_to_local",
                        "RULE:[1:$1@$0](.*@\\Q\\E$)s/@\\Q\\E$//"
                        + "RULE:[2:$1@$0](.*@\\Q\\E$)s/@\\Q\\E$//"
                        + "DEFAULT");
                configurationInner.set("fs.defaultFS", "hdfs://host:9000");
                configurationInner.set("hadoop.security.authentication", "kerberos");
                configurationInner.set("hadoop.security.authorization", "true");
                // This is the value the exception complains about: a bare user name
                // with no service/host part.
                configurationInner.set("dfs.namenode.kerberos.principal", USER_NAME);

                final FileSystem fs = FileSystem.get(configurationInner);
                final FileStatus[] status = fs.listStatus(new Path("hdfs://host:9000/user/abcdefg/sandpit/"));
                for (final FileStatus fileStatus : status) {
                    final BufferedReader br =
                            new BufferedReader(new InputStreamReader(fs.open(fileStatus.getPath())));
                    String line;
                    while ((line = br.readLine()) != null) {
                        System.out.println(line);
                    }
                    br.close();
                }
                System.out.println("--- FINISH ----");
                return UserGroupInformation.getCurrentUser();
            }
        });
    }
}
I am unable to get past this exception:
Caused by: java.lang.IllegalArgumentException: Kerberos principal name does NOT have the expected hostname part: abcdefg
at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:327)
at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:231)
at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:159)
at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:396)
at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:553)
at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:368)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:722)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:718)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:717)
... 28 more
Which Hadoop property should I use to set the hostname part of my Kerberos principal?
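From reading around, I suspect the NameNode principal has to be given in the full service/host@REALM form rather than a bare user name, but I am not sure I am using the right property or format. The sketch below is what I have in mind; nn, namenode.example.com and EXAMPLE.COM are placeholder values, not my real cluster:

    // Sketch only, with placeholder service name, host and realm.
    // As I understand it, Hadoop replaces _HOST with the NameNode's
    // hostname at runtime:
    configurationInner.set("dfs.namenode.kerberos.principal",
            "nn/_HOST@EXAMPLE.COM");
    // or with an explicit host:
    configurationInner.set("dfs.namenode.kerberos.principal",
            "nn/namenode.example.com@EXAMPLE.COM");

Is that the right approach, or is there a different property for the hostname part?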