Hive INSERT Permission denied: user=root, access=WRITE

Time: 2018-04-30 13:37:49

Tags: hive

I created a table named example in Hive.

 CREATE TABLE example (id INT, name STRING, number STRING);

But when I try to insert some values, I get the following error.

 INSERT INTO TABLE example VALUES (1, 'Sample Data', '1234123412341234');

 18/04/30 13:26:46 [HiveServer2-Background-Pool: Thread-40]: WARN  security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
     at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
     at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
     at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
     at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4503)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4473)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4446)
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:882)
     at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:326)
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:640)
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java:415)
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
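The key detail is the inode line: /user is owned by hdfs:supergroup with mode drwxr-xr-x, so the root user has no WRITE access there, and the mkdirs call fails when Hive tries to create a directory under /user. The ownership can be confirmed with a standard HDFS listing (the expected output line below is illustrative, reconstructed from the error message, not captured from the cluster):

 $ hdfs dfs -ls /
 # expect a line like: drwxr-xr-x - hdfs supergroup ... /user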

Thanks for your help.

1 Answer:

Answer 0 (score: 1):

I found the solution: create an HDFS home directory for root and give root ownership of it.

 # su - hdfs                              # switch to the hdfs superuser
 $ hdfs dfs -mkdir /user/root             # create root's HDFS home directory
 $ hdfs dfs -chown root:hdfs /user/root   # hand ownership to root
 $ exit
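This works because Hive needs a writable HDFS home directory (/user/<username>) for the user submitting the query; creating /user/root and chowning it to root satisfies that. As a quick check before retrying the INSERT (standard hdfs dfs commands; the expected line simply reflects the chown above):

 $ hdfs dfs -ls /user
 # expect a line like: drwxr-xr-x - root hdfs ... /user/root

After that, the INSERT from the question should run without the AccessControlException.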