Running a local Spark 2.x program

Time: 2018-05-08 06:06:31

Tags: apache-spark exception hive apache-spark-sql

When I run a local Spark 2.x program from Eclipse, I get this error:

    Exception in thread "main" org.apache.spark.sql.AnalysisException: When hive.metastore.uris is set, please set spark.sql.authorization.enabled and hive.security.authorization.enabled to true to enable authorization;

Code used:

    System.setProperty("hadoop.home.dir", "D:/winutils")

    // Kerberos-related settings (userPrincipal, userKeytabPath and krb5ConfPath
    // are defined elsewhere in the program)
    val ZKServerPrincipal = "zookeeper/hadoop.hadoop.com"
    val ZOOKEEPER_DEFAULT_LOGIN_CONTEXT_NAME = "Client"
    val ZOOKEEPER_SERVER_PRINCIPAL_KEY = "zookeeper.server.principal"
    val hadoopConf: Configuration = new Configuration()
    LoginUtil.setZookeeperServerPrincipal(ZOOKEEPER_SERVER_PRINCIPAL_KEY, ZKServerPrincipal)
    LoginUtil.login(userPrincipal, userKeytabPath, krb5ConfPath, hadoopConf)

    // Creating the Spark session
    val spark = SparkSession.builder()
      .appName("conenction")
      .config("spark.master", "local")
      .config("spark.sql.authorization.enabled", "true")
      .enableHiveSupport()
      .getOrCreate()

    val df75 = spark.sql("select * from dbname.tablename limit 10")
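The exception text itself points at the mismatch: it asks for both spark.sql.authorization.enabled and hive.security.authorization.enabled to be true when hive.metastore.uris is set, while the code above only sets the first. A minimal sketch of a builder that sets both, assuming the property names in the error message are exactly what this distribution expects (LoginUtil and spark.sql.authorization.enabled suggest a vendor distribution such as Huawei FusionInsight rather than vanilla Spark, so confirm against that cluster's documentation):

    // Sketch: set BOTH authorization flags named in the AnalysisException.
    // Property names are taken from the error message, not verified against
    // the vendor documentation; adjust if your distribution differs.
    val spark = SparkSession.builder()
      .appName("conenction")
      .config("spark.master", "local")
      .config("spark.sql.authorization.enabled", "true")
      .config("hive.security.authorization.enabled", "true")
      .enableHiveSupport()
      .getOrCreate()

Alternatively, the same two properties could be set cluster-wide in spark-defaults.conf / hive-site.xml instead of in code, so every job picks them up. This cannot be exercised without a Kerberos-secured Hive metastore, so it is offered as a configuration sketch only.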

0 Answers:

No answers