I'm trying to connect to HDFS locally from IntelliJ installed on my laptop. The cluster I'm connecting to is Kerberized and fronted by an edge node. I generated a keytab for the edge node and configured it in the code below, and I can now log in to the edge node. However, when I then try to access HDFS data on the namenode, an error is thrown. Below is the Scala code that attempts to connect to HDFS:
import org.apache.spark.sql.SparkSession
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation
import java.security.PrivilegedExceptionAction
import java.io.PrintWriter

object DataframeEx {
  def main(args: Array[String]) {
    // $example on:init_session$
    val spark = SparkSession
      .builder()
      .master("local")
      .appName("Spark SQL basic example")
      .config("spark.some.config.option", "some-value")
      .getOrCreate()

    runHdfsConnect(spark)
    spark.stop()
  }

  def runHdfsConnect(spark: SparkSession): Unit = {
    System.setProperty("HADOOP_USER_NAME", "m12345")
    val path = new Path("/data/interim/modeled/abcdef")

    val conf = new Configuration()
    conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
    conf.set("hadoop.security.authentication", "kerberos")
    conf.set("dfs.namenode.kerberos.principal.pattern", "hdfs/_HOST@HUGH.COM")

    UserGroupInformation.setConfiguration(conf)
    val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      "m12345@HUGH.COM", "C:\\Users\\m12345\\Downloads\\m12345.keytab")
    println(UserGroupInformation.isSecurityEnabled())

    ugi.doAs(new PrivilegedExceptionAction[String] {
      override def run(): String = {
        val fs = FileSystem.get(conf)
        val output = fs.create(path)
        val writer = new PrintWriter(output)
        try {
          writer.write("this is a test")
          writer.write("\n")
        } finally {
          writer.close()
          println("Closed!")
        }
        "done"
      }
    })
  }
}
I'm able to log in to the edge node, but when I try to write to HDFS (inside the doAs method), the following error is thrown:
WARN Client: Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM
18/06/11 12:12:01 ERROR UserGroupInformation: PriviledgedActionException m12345@HUGH.COM (auth:KERBEROS) cause:java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM
18/06/11 12:12:01 ERROR UserGroupInformation: PriviledgedActionException as:m12345@HUGH.COM (auth:KERBEROS) cause:java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM; Host Details : local host is: "INMBP-m12345/172.29.155.52"; destination host is: "namenodename.hugh.com":8020;
Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM; Host Details : local host is: "INMBP-m12345/172.29.155.52"; destination host is: "namenodename.hugh.com":8020
If I log in to the edge node and do a kinit, I can access HDFS normally. So why can't I access the HDFS namenode after logging in with the edge node keytab?
Let me know if more details are needed from my side.
Answer 0 (score: 1)
The conf object was not set up correctly. Below is what worked for me:
val conf = new Configuration()
conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
conf.set("hadoop.security.authentication", "kerberos")
conf.set("hadoop.rpc.protection", "privacy") // was missing this parameter
conf.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@HUGH.COM") // was initially wrongly set as dfs.namenode.kerberos.principal.pattern
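Putting the corrected configuration back into the question's flow, a minimal sketch could look like the following. The hostnames, realm, username, and keytab path are taken from the question and would need adapting; this can only actually run against a Kerberized cluster whose `hadoop.rpc.protection` is set to `privacy`:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation
import java.security.PrivilegedExceptionAction
import java.io.PrintWriter

object KerberizedHdfsWrite {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
    conf.set("hadoop.security.authentication", "kerberos")
    // The SASL quality-of-protection must match the cluster's setting
    conf.set("hadoop.rpc.protection", "privacy")
    // The namenode's service principal itself, not the *.pattern property;
    // _HOST is substituted with the namenode hostname at connect time
    conf.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@HUGH.COM")

    UserGroupInformation.setConfiguration(conf)
    val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      "m12345@HUGH.COM", "C:\\Users\\m12345\\Downloads\\m12345.keytab")

    // All filesystem calls run inside doAs so the RPCs use the
    // Kerberos credentials obtained from the keytab login
    ugi.doAs(new PrivilegedExceptionAction[Unit] {
      override def run(): Unit = {
        val fs = FileSystem.get(conf)
        val out = fs.create(new Path("/data/interim/modeled/abcdef"))
        val writer = new PrintWriter(out)
        try writer.println("this is a test")
        finally writer.close()
      }
    })
  }
}
```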