I am trying to configure SQL standard-based authorization in Spark 1.4.0, the same way I did for Hive 0.13.1 by adding the following properties:
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
  <description>enable or disable the Hive client authorization</description>
</property>
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory</value>
  <description>The Hive client authorization manager class name. The user-defined authorization class should implement interface org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider.</description>
</property>
<property>
  <name>hive.security.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator</value>
  <description>Hive client authenticator manager class name. The user-defined authenticator should implement interface org.apache.hadoop.hive.ql.security.HiveAuthenticationProvider.</description>
</property>
<property>
  <name>hive.users.in.admin.role</name>
  <value>hduser</value>
  <description>Comma-separated list of users who are in the admin role for bootstrapping. More users can be added to the ADMIN role later.</description>
</property>
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
  <description>Setting this property to true will have HiveServer2 execute Hive operations as the user making the calls to it.</description>
</property>
This works fine in Hive, but it does not work correctly in Spark: whenever I try to set privileges on a table or create a role, an exception is thrown.
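For reference, these are the kinds of statements involved. The exact statements and the role, table, and user names below (analyst, mytable, someuser) are placeholders, since the original statements were not shown; they succeed in the Hive CLI when run as hduser (the bootstrap admin user from hive.users.in.admin.role) but fail in Spark SQL:

```sql
-- Activate the admin role for the bootstrap admin user.
SET ROLE ADMIN;

-- Create a role and grant it privileges on a table.
CREATE ROLE analyst;
GRANT SELECT ON TABLE mytable TO ROLE analyst;

-- Assign the role to a user.
GRANT ROLE analyst TO USER someuser;
```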