java.io.FileNotFoundException: HIVE_PLAN (No such file or directory)

Date: 2014-06-16 07:23:06

Tags: java hadoop hive

I am running Hadoop-2.2.0 + Hive-0.13.0 on a cluster with 5 datanodes. The WordCount example runs successfully, and I can create tables in the hive CLI. But whenever I run a hive query that launches a mapreduce job, I keep getting an error like this:

Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: java.io.FileNotFoundException: HIVE_PLAN7b8ea437-8ec3-4c05-af4e-3cd6466dce85 (No such file or directory)
    at org.apache.hadoop.hive.ql.exec.Utilities.getMapRedWork(Utilities.java:230)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:255)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:381)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:374)
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:540)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:167)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:408)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.io.FileNotFoundException: HIVE_PLAN7b8ea437-8ec3-4c05-af4e-3cd6466dce85 (No such file or directory)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:146)
    at java.io.FileInputStream.<init>(FileInputStream.java:101)
    at org.apache.hadoop.hive.ql.exec.Utilities.getMapRedWork(Utilities.java:221)
    ... 12 more


FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

Thanks in advance!

1 Answer:

Answer 0 (score: 0)

I finally found the problem: I was running shark-0.9.1, which is compiled against hive-0.11, on the same cluster. When YARN started, it picked up the hive-0.11 jar files, and that caused the error!

I removed the shark classpath from yarn.application.classpath in yarn-site.xml, and the error was fixed!
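For reference, a minimal sketch of what the cleaned-up property might look like in yarn-site.xml. The exact entries depend on your installation; the point is only that no shark (and thus no bundled hive-0.11) jars remain on the list. The paths below use the standard Hadoop classpath expansion variables, not my actual directories:

```xml
<property>
  <name>yarn.application.classpath</name>
  <!-- Only Hadoop's own jars here; the shark lib directory
       (which shipped hive-0.11 jars) has been removed. -->
  <value>
    $HADOOP_CONF_DIR,
    $HADOOP_COMMON_HOME/share/hadoop/common/*,
    $HADOOP_COMMON_HOME/share/hadoop/common/lib/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*
  </value>
</property>
```

After editing, restart the NodeManagers so the new classpath takes effect for containers launched by YARN.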

Thanks