Exception in thread "main" java.lang.VerifyError: Bad type on operand stack

Asked: 2015-02-14 12:59:18

Tags: java hadoop mapreduce

This error occurred in a MapReduce program that finds the maximum temperature in a given input.txt file. The input has two columns: year and temperature.
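The per-year maximum logic at the heart of such a job can be sketched without any Hadoop dependencies. This is a minimal illustration, not the asker's actual MaxTemperatureDriver; the whitespace-separated "year temperature" line format is an assumption based on the question's description.

```java
import java.util.HashMap;
import java.util.Map;

public class MaxTemperature {
    // Computes the maximum temperature per year from lines of the
    // assumed form "YEAR TEMPERATURE", e.g. "1950 22".
    static Map<String, Integer> maxPerYear(String[] lines) {
        Map<String, Integer> max = new HashMap<>();
        for (String line : lines) {
            String[] parts = line.trim().split("\\s+");
            String year = parts[0];
            int temp = Integer.parseInt(parts[1]);
            // Keep the larger temperature seen so far for this year.
            max.merge(year, temp, Math::max);
        }
        return max;
    }

    public static void main(String[] args) {
        String[] sample = { "1950 0", "1950 22", "1949 111" };
        System.out.println(maxPerYear(sample).get("1950")); // prints 22
    }
}
```

In the real job, this logic would be split between a Mapper (emit year/temperature pairs) and a Reducer (take the maximum per key).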

    Exception in thread "main" java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    org/apache/hadoop/mapred/JobTrackerInstrumentation.create(Lorg/apache/hadoop/mapred/JobTracker;Lorg/apache/hadoop/mapred/JobConf;)Lorg/apache/hadoop/mapred/JobTrackerInstrumentation; @5: invokestatic
  Reason:
    Type 'org/apache/hadoop/metrics2/lib/DefaultMetricsSystem' (current frame, stack[2]) is not assignable to 'org/apache/hadoop/metrics2/MetricsSystem'
  Current Frame:
    bci: @5
    flags: { }
    locals: { 'org/apache/hadoop/mapred/JobTracker', 'org/apache/hadoop/mapred/JobConf' }
    stack: { 'org/apache/hadoop/mapred/JobTracker', 'org/apache/hadoop/mapred/JobConf', 'org/apache/hadoop/metrics2/lib/DefaultMetricsSystem' }
  Bytecode:
    0000000: 2a2b b200 03b8 0004 b0                 

    at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:422)
    at org.apache.hadoop.mapred.JobClient.init(JobClient.java:488)
    at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:473)
    at org.apache.hadoop.mapreduce.Job$1.run(Job.java:513)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapreduce.Job.connect(Job.java:511)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:499)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
    at com.letsdobigdata.MaxTemperatureDriver.run(MaxTemperatureDriver.java:35)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.letsdobigdata.MaxTemperatureDriver.main(MaxTemperatureDriver.java:41)

2 Answers:

Answer 0: (score: 0)

Try downgrading your JDK to version 7, and download the JDK from the Oracle website.

I was able to solve the problem with JDK 7.
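If you try this, it is worth confirming which JDK your JVM actually runs with, since multiple installations can coexist. A minimal check:

```java
public class JavaVersionCheck {
    public static void main(String[] args) {
        // The runtime's reported version string; on JDK 7 it starts with "1.7".
        System.out.println(System.getProperty("java.version"));
    }
}
```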

Answer 1: (score: -3)

It is basically a type mismatch on the JVM operand stack.
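Concretely, the bytecode verifier rejects the method at class-load time because the class on the operand stack is not assignable to the type the method expects. The stand-in types below are hypothetical, not Hadoop's real classes; in the actual error, `DefaultMetricsSystem` and `MetricsSystem` typically come from incompatible Hadoop library versions mixed on the classpath, so the assignability check fails.

```java
public class AssignabilityDemo {
    interface MetricsSystem {}                  // the type the bytecode expects
    static class StandaloneMetrics {}           // a class that does NOT implement it
    static class ConformingMetrics implements MetricsSystem {} // a class that does

    public static void main(String[] args) {
        // The verifier performs essentially this assignability check;
        // when it fails, loading the class throws java.lang.VerifyError.
        System.out.println(MetricsSystem.class.isAssignableFrom(StandaloneMetrics.class)); // false
        System.out.println(MetricsSystem.class.isAssignableFrom(ConformingMetrics.class)); // true
    }
}
```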

It is definitely a bug in Hadoop.