Sqoop cannot import data when Ranger is enabled

Time: 2018-03-05 10:19:34

Tags: hadoop sqoop apache-ranger

I am using HDP-2.6 with Ranger enabled, working as the admin user.

I added a policy in Ranger that grants the user yarn full permissions (Read, Write, Execute) on the path /data, with "Recursive" enabled.

I want to use Sqoop to import data from MySQL into Hive, but every time I get the following exception:
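For context, the import in question has this general shape; the sketch below is a typical Sqoop invocation for MySQL-to-Hive, where the host, database name, and credentials are hypothetical placeholders, not values from my cluster:

```shell
# Hypothetical example of the kind of import that triggers the error.
# dbhost, source_db, etl and the password are placeholders.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/source_db \
  --username etl \
  --password-file /user/etl/.mysql_pass \
  --table test_table \
  --hive-import \
  --hive-database ods \
  --hive-table test_table \
  --compress \
  -m 4
```

The failure happens in the final "Loading data to table" step, after the map tasks have already written the part files to HDFS.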


30998 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  - Loading data to table ods.test_table
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  - Failed with exception org.apache.hadoop.security.AccessControlException: Permission denied: user=yarn, access=EXECUTE, inode="/data/hive/warehouse/ods/test_table/part-m-00000.gz":admin:hadoop:drwx------
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:353)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:292)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:238)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkDefaultEnforcer(RangerHdfsAuthorizer.java:428)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:365)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1950)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1934)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkOwner(FSDirectory.java:1903)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.FSDirAttrOp.setPermission(FSDirAttrOp.java:63)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1850)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:821)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:465)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
31222 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
31223 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
31223 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
31223 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
31223 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at java.security.AccessController.doPrivileged(Native Method)
31223 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at javax.security.auth.Subject.doAs(Subject.java:422)
31223 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
31223 [Thread-28] INFO  org.apache.sqoop.hive.HiveImport  -     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)

At first I thought the Ranger policy was not taking effect, but the Ranger audit log shows this:


{"repoType":1,"repo":"xstudy_hadoop","reqUser":"yarn","evtTime":"2018-03-05 17:33:18.446","access":"WRITE","resource":"/data/hive/warehouse/ods/test_table/part-m-00002.gz","resType":"path","action":"write","result":1,"policy":10,"reason":"/data/hive/warehouse/ods/test_table","enforcer":"ranger-acl","cliIP":"10.0.30.2","agentHost":"master","logType":"RangerAudit","id":"727c87c5-eeba-465e-ad8d-f1129c01801f-269954","seq_num":433117,"event_count":1,"event_dur_ms":0,"tags":[],"cluster_name":"xstudy"}
{"repoType":1,"repo":"xstudy_hadoop","reqUser":"yarn","evtTime":"2018-03-05 17:33:18.446","access":"WRITE","resource":"/data/hive/warehouse/ods/test_table/part-m-00002.gz","resType":"path","action":"write","result":1,"policy":10,"reason":"/data/hive/warehouse/ods/test_table","enforcer":"ranger-acl","cliIP":"10.0.30.2","agentHost":"master","logType":"RangerAudit","id":"727c87c5-eeba-465e-ad8d-f1129c01801f-269955","seq_num":433119,"event_count":1,"event_dur_ms":0,"tags":[],"cluster_name":"xstudy"}
{"repoType":1,"repo":"xstudy_hadoop","reqUser":"yarn","evtTime":"2018-03-05 17:33:18.447","access":"READ","resource":"/data/hive/warehouse/ods/test_table","resType":"path","action":"read","result":1,"policy":10,"reason":"/data/hive/warehouse/ods/test_table","enforcer":"ranger-acl","cliIP":"10.0.30.2","agentHost":"master","logType":"RangerAudit","id":"727c87c5-eeba-465e-ad8d-f1129c01801f-269956","seq_num":433121,"event_count":1,"event_dur_ms":0,"tags":[],"cluster_name":"xstudy"}
{"repoType":1,"repo":"xstudy_hadoop","reqUser":"yarn","evtTime":"2018-03-05 17:33:18.447","access":"EXECUTE","resource":"/data/hive/warehouse/ods/test_table/part-m-00000.gz","resType":"path","action":"execute","result":0,"policy":-1,"reason":"/data/hive/warehouse/ods/test_table","enforcer":"hadoop-acl","cliIP":"10.0.30.2","agentHost":"master","logType":"RangerAudit","id":"727c87c5-eeba-465e-ad8d-f1129c01801f-269957","seq_num":433123,"event_count":1,"event_dur_ms":0,"tags":[],"cluster_name":"xstudy"}
{"repoType":1,"repo":"xstudy_hadoop","reqUser":"yarn","evtTime":"2018-03-05 17:33:18.448","access":"EXECUTE","resource":"/data/hive/warehouse/ods/test_table/part-m-00001.gz","resType":"path","action":"execute","result":0,"policy":-1,"reason":"/data/hive/warehouse/ods/test_table","enforcer":"hadoop-acl","cliIP":"10.0.30.2","agentHost":"master","logType":"RangerAudit","id":"727c87c5-eeba-465e-ad8d-f1129c01801f-269958","seq_num":433125,"event_count":1,"event_dur_ms":0,"tags":[],"cluster_name":"xstudy"}
{"repoType":1,"repo":"xstudy_hadoop","reqUser":"yarn","evtTime":"2018-03-05 17:33:18.448","access":"EXECUTE","resource":"/data/hive/warehouse/ods/test_table/part-m-00002.gz","resType":"path","action":"execute","result":0,"policy":-1,"reason":"/data/hive/warehouse/ods/test_table","enforcer":"hadoop-acl","cliIP":"10.0.30.2","agentHost":"master","logType":"RangerAudit","id":"727c87c5-eeba-465e-ad8d-f1129c01801f-269959","seq_num":433127,"event_count":1,"event_dur_ms":0,"tags":[],"cluster_name":"xstudy"}
{"repoType":1,"repo":"xstudy_hadoop","reqUser":"yarn","evtTime":"2018-03-05 17:33:18.449","access":"WRITE","resource":"/data/hive/warehouse/ods/test_table/part-m-00003.gz","resType":"path","action":"write","result":1,"policy":10,"reason":"/data/hive/warehouse/ods/test_table","enforcer":"ranger-acl","cliIP":"10.0.30.2","agentHost":"master","logType":"RangerAudit","id":"727c87c5-eeba-465e-ad8d-f1129c01801f-269960","seq_num":433129,"event_count":1,"event_dur_ms":0,"tags":[],"cluster_name":"xstudy"}
{"repoType":1,"repo":"xstudy_hadoop","reqUser":"yarn","evtTime":"2018-03-05 17:33:18.449","access":"WRITE","resource":"/data/hive/warehouse/ods/test_table/part-m-00003.gz","resType":"path","action":"write","result":1,"policy":10,"reason":"/data/hive/warehouse/ods/test_table","enforcer":"ranger-acl","cliIP":"10.0.30.2","agentHost":"master","logType":"RangerAudit","id":"727c87c5-eeba-465e-ad8d-f1129c01801f-269961","seq_num":433131,"event_count":1,"event_dur_ms":0,"tags":[],"cluster_name":"xstudy"}
{"repoType":1,"repo":"xstudy_hadoop","reqUser":"yarn","evtTime":"2018-03-05 17:33:18.451","access":"EXECUTE","resource":"/data/hive/warehouse/ods/test_table/part-m-00003.gz","resType":"path","action":"execute","result":0,"policy":-1,"reason":"/data/hive/warehouse/ods/test_table","enforcer":"hadoop-acl","cliIP":"10.0.30.2","agentHost":"master","logType":"RangerAudit","id":"727c87c5-eeba-465e-ad8d-f1129c01801f-269962","seq_num":433133,"event_count":1,"event_dur_ms":0,"tags":[],"cluster_name":"xstudy"}

From the audit log, yarn simply cannot get EXECUTE permission. Why? I am 100% sure the permission has been granted.
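The pattern in the audit entries is telling: the denied events have `result:0`, `policy:-1`, and `enforcer:"hadoop-acl"`, which means no Ranger policy matched those paths and the decision fell through to plain HDFS permissions. A minimal sketch of filtering such denials out of the JSON-lines audit output (field names are taken from the log above; the filtering function itself is illustrative):

```python
import json

# Ranger writes one JSON object per audit event. result == 0 means the
# access was denied; policy == -1 together with enforcer == "hadoop-acl"
# means no Ranger policy matched and native HDFS permissions decided.
def denied_events(lines):
    for line in lines:
        event = json.loads(line)
        if event["result"] == 0:
            yield event["access"], event["resource"], event["enforcer"]

sample = [
    '{"access":"WRITE","resource":"/data/a","result":1,"policy":10,"enforcer":"ranger-acl"}',
    '{"access":"EXECUTE","resource":"/data/b","result":0,"policy":-1,"enforcer":"hadoop-acl"}',
]
print(list(denied_events(sample)))  # [('EXECUTE', '/data/b', 'hadoop-acl')]
```

Run against the log above, every EXECUTE denial comes from `hadoop-acl`, never from `ranger-acl`, so the Ranger policy is being consulted but not matching the part files.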

I have two questions:

  1. Why does yarn not get EXECUTE permission on the folder /data?
  2. Sqoop has already imported the data, so why does yarn need EXECUTE permission at all?

====== More information ======

Today I ran the same sqoop command, with exactly the same parameters, directly on a Linux machine in the cluster, and it succeeded.

1 Answer:

Answer 0 (score: 0)

I think the problem is that the owner of the directory /data/hive/warehouse/ods/test_table/ is admin, a member of the hadoop group. Since its mode is drwx------, the user yarn has no permission to run the job there.
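Why drwx------ locks yarn out follows from the POSIX-style check HDFS applies when no Ranger policy matches. A simplified sketch (this is an illustration of the rule, not the actual FSPermissionChecker code):

```python
# POSIX-style permission check: pick owner/group/other bits, then test
# the requested access mask (4 = read, 2 = write, 1 = execute).
def allowed(user, groups, owner, group, mode, want):
    if user == owner:
        bits = (mode >> 6) & 7   # owner triad
    elif group in groups:
        bits = (mode >> 3) & 7   # group triad
    else:
        bits = mode & 7          # other triad
    return bits & want == want

# drwx------ (0o700) owned by admin:hadoop, as in the exception above.
# Even though yarn is typically in the hadoop group, the group triad is
# empty, so EXECUTE (traverse) is denied:
print(allowed("yarn", {"hadoop"}, "admin", "hadoop", 0o700, 1))   # False
print(allowed("admin", {"hadoop"}, "admin", "hadoop", 0o700, 1))  # True
```

This is also why the same command succeeds when run as a user that maps to the directory owner: only the owner triad of 0o700 grants anything.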

Try changing the permissions of the directory. Run the command:

hdfs dfs -chmod -R 755 /data/hive/warehouse/ods/test_table

Then try re-running your job.