Apache Hive jobs not working - Container failed, exitCode=-1000. Could not obtain block

Asked: 2016-05-10 22:05:15

Tags: mapreduce hive hortonworks-data-platform ambari

I just installed Hadoop using the Hortonworks Data Platform. I have three machines running CentOS 7. One of the three runs ambari-server, the NameNode, HiveServer2, etc. The other two only run the clients for those services.

Whenever I try to execute a Hive query that requires a MapReduce job, it fails. Every TaskAttempt in every job fails with a BlockMissingException, and the diagnostics are set to "[Container failed, exitCode=-1000. Could not obtain block...".

For example:

hive> select count(*) from pgc;
Query ID = root_20160510184153_51d881b2-fbb5-47d3-8a06-9d62f51950e1
Total jobs = 1
Launching Job 1 out of 1


Status: Running (Executing on YARN cluster with App id application_1462904248344_0007)

--------------------------------------------------------------------------------
        VERTICES      STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
--------------------------------------------------------------------------------
Map 1                 FAILED      9          0        0        9      14       0
Reducer 2             KILLED      1          0        0        1       0       0
--------------------------------------------------------------------------------
VERTICES: 00/02  [>>--------------------------] 0%    ELAPSED TIME: 80.05 s
--------------------------------------------------------------------------------
Status: Failed
Vertex failed, vertexName=Map 1, vertexId=vertex_1462904248344_0007_1_00, diagnostics=[Task failed, taskId=task_1462904248344_0007_1_00_000001, diagnostics=[TaskAttempt 0 failed, info=[Container container_e49_1462904248344_0007_02_000003 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 1 failed, info=[Container container_e49_1462904248344_0007_02_000009 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 2 failed, info=[Container container_e49_1462904248344_0007_02_000013 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 3 failed, info=[Container container_e49_1462904248344_0007_02_000018 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]]], Task failed, taskId=task_1462904248344_0007_1_00_000003, diagnostics=[TaskAttempt 0 failed, info=[Container container_e49_1462904248344_0007_02_000005 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 1 failed, info=[Container container_e49_1462904248344_0007_02_000008 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 2 failed, info=[Container container_e49_1462904248344_0007_02_000014 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 3 failed, info=[Container container_e49_1462904248344_0007_02_000017 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:2 killedTasks:7, Vertex vertex_1462904248344_0007_1_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]
...

Has anyone seen this problem before? Thanks in advance.

1 Answer:

Answer 0: (score: 0)

In a Linux terminal, su hdfs and run hadoop dfsadmin -report to confirm that the block is not corrupt.
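A minimal sketch of that check, plus an fsck pass over the two files named in the diagnostics above (the fsck step is an addition, not part of the original answer; the paths are taken directly from the error output):

    su - hdfs
    # Summarize DataNode liveness, capacity, and blocks with missing replicas
    hadoop dfsadmin -report
    # Report block locations and any missing/corrupt blocks for the jars the containers fail to localize
    hdfs fsck /user/root/.hiveJars -files -blocks -locations
    hdfs fsck /tmp/hive/root -files -blocks -locations

If fsck reports missing blocks for those jars, it is usually enough to delete them from HDFS; Hive re-uploads hive-exec to /user/root/.hiveJars and recreates the Tez session directory on the next session.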

Judging from the logs, you are running the query as the root user; try configuring impersonation (proxy user) for root. Log in to the Ambari UI and go to HDFS -> Configs -> Advanced -> Custom core-site.

Update or add the following properties:

hadoop.proxyuser.root.groups=*

hadoop.proxyuser.root.hosts=*
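
For reference, this is what the equivalent entries look like in core-site.xml itself (a sketch; Ambari generates this file from the properties above, so normally you only add them in the UI):

    <property>
      <name>hadoop.proxyuser.root.groups</name>
      <value>*</value>
    </property>
    <property>
      <name>hadoop.proxyuser.root.hosts</name>
      <value>*</value>
    </property>

After saving, restart HDFS and the dependent services (YARN, Hive) from Ambari so the new proxy-user settings take effect.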