I ran into this problem while running a MapReduce job:
19/02/25 03:06:41 INFO mapreduce.Job: Job job_1551092767354_0001 failed with state FAILED due to: Application application_1551092767354_0001 failed 2 times due to AM Container for appattempt_1551092767354_0001_000002 exited with exitCode: 1
For more detailed output, check application tracking page:http://quickstart.cloudera:8088/proxy/application_1551092767354_0001/Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1551092767354_0001_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
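(In case it helps: beyond the tracking page, the full AM container logs can also be dumped from the command line with the yarn logs CLI, assuming log aggregation is enabled on the quickstart VM:)

    yarn logs -applicationId application_1551092767354_0001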
I tried manually adding the YARN classpath to yarn-site.xml, but that didn't help. Does anyone know how to fix this? Thanks.
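For reference, this is roughly the snippet I added to yarn-site.xml (the property name is yarn.application.classpath; the values below are just the standard Hadoop defaults I copied in, so the exact paths are my assumption):

    <property>
      <name>yarn.application.classpath</name>
      <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,
             $HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,
             $HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,
             $HADOOP_YARN_HOME/*,$HADOOP_YARN_HOME/lib/*</value>
    </property>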
Edit: here is the error log, but I'm not sure what it means:
ERROR [main] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Failed checking for the existence of history intermediate done directory: [hdfs://quickstart.cloudera:8020/user/history/done_intermediate]
2019-02-25 03:06:36,034 INFO [main] org.apache.hadoop.service.AbstractService: Service JobHistoryEventHandler failed in state INITED; cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=cloudera, access=WRITE, inode="/user/history":mapred:hadoop:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:216)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:145)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6596)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6578)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6530)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4334)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4304)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4277)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:852)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:321)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:601)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)
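If I read the AccessControlException correctly, the job history handler (running as user cloudera) is being denied WRITE access to /user/history, which is owned by mapred:hadoop with mode drwxr-xr-x. Would loosening that directory's permissions as the HDFS superuser be the right fix? Something like this, assuming the Cloudera quickstart VM where sudo -u hdfs works:

    # check the current owner and mode of the history directory
    sudo -u hdfs hdfs dfs -ls /user
    # option 1: make it world-writable with the sticky bit, like /tmp
    sudo -u hdfs hdfs dfs -chmod -R 1777 /user/history
    # option 2: hand ownership to the user submitting the job
    sudo -u hdfs hdfs dfs -chown -R cloudera /user/history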