Spark Streaming upgrade to 2.1.0 throws java.lang.VerifyError: Inconsistent stackmap frames at branch target 152

Asked: 2017-01-10 19:52:35

Tags: apache-spark guava spark-streaming

I upgraded Spark from 1.6.1 to 2.1.0, and Spark Streaming along with it. I use the spark-rabbitmq library for streaming, which I upgraded from 0.3.0 to 0.5.1.

When I run my Spark Streaming job with spark-submit on a standalone Spark cluster (pre-built for Hadoop 2.7), I get the following error:

2017-01-09 12:32:17 ERROR Executor:91 - Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.VerifyError: Inconsistent stackmap frames at branch target 152
Exception Details:
Location:
akka/dispatch/Mailbox.processAllSystemMessages()V @152: getstatic
Reason:
Type top (current frame, locals[9]) is not assignable to 'akka/dispatch/sysmsg/SystemMessage' (stack map, locals[9])
Current Frame:
bci: @131
flags: { }
locals: { 'akka/dispatch/Mailbox', 'java/lang/InterruptedException', 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox', 'java/lang/Throwable', 'java/lang/Throwable' }
stack: { integer }
Stackmap Frame:
bci: @152
flags: { }
locals: { 'akka/dispatch/Mailbox', 'java/lang/InterruptedException', 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox', 'java/lang/Throwable', 'java/lang/Throwable', top, top, 'akka/dispatch/sysmsg/SystemMessage' }
stack: { }
Bytecode:
0x0000000: 014c 2ab2 0132 b601 35b6 0139 4db2 013e
0x0000010: 2cb6 0142 9900 522a b600 c69a 004b 2c4e
0x0000020: b201 3e2c b601 454d 2db9 0148 0100 2ab6
0x0000030: 0052 2db6 014b b801 0999 000e bb00 e759
0x0000040: 1301 4db7 010f 4cb2 013e 2cb6 0150 99ff
0x0000050: bf2a b600 c69a ffb8 2ab2 0132 b601 35b6
0x0000060: 0139 4da7 ffaa 2ab6 0052 b600 56b6 0154
0x0000070: b601 5a3a 04a7 0091 3a05 1905 3a06 1906
0x0000080: c100 e799 0015 1906 c000 e73a 0719 074c
0x0000090: b200 f63a 08a7 0071 b201 5f19 06b6 0163
0x00000a0: 3a0a 190a b601 6899 0006 1905 bf19 0ab6
0x00000b0: 016c c000 df3a 0b2a b600 52b6 0170 b601
0x00000c0: 76bb 000f 5919 0b2a b600 52b6 017a b601
0x00000d0: 80b6 0186 2ab6 018a bb01 8c59 b701 8e13
0x00000e0: 0190 b601 9419 09b6 0194 1301 96b6 0194
0x00000f0: 190b b601 99b6 0194 b601 9ab7 019d b601
0x0000100: a3b2 00f6 3a08 b201 3e2c b601 4299 0026
0x0000110: 2c3a 09b2 013e 2cb6 0145 4d19 09b9 0148
0x0000120: 0100 1904 2ab6 0052 b601 7a19 09b6 01a7
0x0000130: a7ff d62b c600 09b8 0109 572b bfb1
Exception Handler Table:
bci [290, 307] => handler: 120
Stackmap Table:
append_frame(@13,Object[#231],Object[#177])
append_frame(@71,Object[#177])
chop_frame(@102,1)
full_frame(@120,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#177]},{Object[#223]})
full_frame(@152,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#223],Object[#223],Top,Top,Object[#177]},{})
append_frame(@173,Object[#357])
full_frame(@262,{Object[#2],Object[#231],Object[#177],Top,Object[#2]},{})
same_frame(@307)
same_frame(@317)

at akka.dispatch.Mailboxes.<init>(Mailboxes.scala:33)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:628)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:109)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:100)
at org.apache.spark.streaming.rabbitmq.receiver.RabbitMQReceiver.onStart(RabbitMQInputDStream.scala:57)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:149)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:131)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:607)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:597)
at org.apache.spark.SparkContext$$anonfun$34.apply(SparkContext.scala:2021)
at org.apache.spark.SparkContext$$anonfun$34.apply(SparkContext.scala:2021)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

My pom.xml file is as follows    ...

    <properties>
        <slf4j.version>1.7.7</slf4j.version>
        <log4j.version>1.2.17</log4j.version>
        <mapr.hbase.version>5.0.0-mapr</mapr.hbase.version>
        <guava.version>19.0</guava.version>
    </properties>
....
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>${guava.version}</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.1.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>com.stratio.receiver</groupId>
    <artifactId>spark-rabbitmq</artifactId>
    <version>0.5.1</version>
</dependency>
</dependencies>

    <build>
.....
.....
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <relocations>
                            <relocation>
                                <pattern>com.google</pattern>
                                <shadedPattern>shadeio</shadedPattern>
                                <includes>
                                    <include>com.google.**</include>
                                </includes>
                            </relocation>
                        </relocations>
                        <transformers>
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </build>

If I remove the shade plugin relocation for Guava, the code works. But as soon as I add it back, it throws the error above. I need the newer Guava 19.0 for one of my Streaming jobs, which is why I need to shade it in the first place.
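For context, a `<relocation>` like the one above rewrites every class reference under `com.google` to the `shadeio` prefix inside the shaded jar, so the job can use Guava 19.0 while Spark keeps its own older Guava. A simplified sketch of the idea (my own illustration, not the plugin's actual implementation):

```java
// Sketch of what a shade-plugin relocation does to class names: rewrite names
// under the configured pattern, leave everything else untouched.
public class RelocationSketch {
    static String relocate(String className, String pattern, String shadedPattern) {
        return className.startsWith(pattern + ".")
                ? shadedPattern + className.substring(pattern.length())
                : className;
    }

    public static void main(String[] args) {
        // Guava classes move under the shaded prefix...
        System.out.println(relocate("com.google.common.collect.ImmutableList",
                "com.google", "shadeio"));   // shadeio.common.collect.ImmutableList
        // ...while unrelated classes (e.g. Akka's) keep their original names.
        System.out.println(relocate("akka.dispatch.Mailbox",
                "com.google", "shadeio"));   // akka.dispatch.Mailbox
    }
}
```

Note that the plugin rewrites class files (bytecode), not just names, which is why a buggy rewriter can produce invalid stack map frames like the ones in the error above.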

Any suggestions on how to fix this issue?

2 answers:

Answer 0 (score: 0)

You should exclude the conflicting Guava versions from the other dependencies that pull them in.

An example of a dependency (Kafka in this case) with such exclusions:

    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <exclusions>
            <exclusion>
                <artifactId>google-collections</artifactId>
                <groupId>google-collections</groupId>
            </exclusion>
            <exclusion>
                <artifactId>guava</artifactId>
                <groupId>com.google.guava</groupId>
            </exclusion>
        </exclusions>
        <version>${kafka.version}</version>
    </dependency>
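If it is unclear whether the exclusions actually took effect, one way to check at runtime is to print where a Guava class is loaded from. A minimal, self-contained sketch (the class name `WhichJar` is mine, not from the question):

```java
import java.security.CodeSource;

// Diagnostic sketch: print which jar/location a class was actually loaded from,
// to check that the Guava on the classpath is the one you expect.
public class WhichJar {
    static String locationOf(Class<?> c) {
        CodeSource cs = c.getProtectionDomain().getCodeSource();
        // Bootstrap-loaded classes (e.g. java.lang.String) have no code source.
        return cs == null ? "(bootstrap classpath)" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // In a real Spark job you would pass a Guava class, e.g.
        // com.google.common.collect.ImmutableList.class; this sketch inspects
        // itself so it compiles without Guava on the classpath.
        System.out.println(locationOf(WhichJar.class));
    }
}
```

Running this inside a task (rather than on the driver) shows what the executor classpath resolves to, which is where the VerifyError is being thrown.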

Also, I recommend adding a transformer for certain Akka configuration files that are known to cause trouble when shading: Akka ships its default settings in a `reference.conf` inside each of its jars, and when the shade plugin merges jars only one copy survives unless an `AppendingTransformer` concatenates them:

    <transformers>
        <transformer
            implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
        </transformer>
    </transformers>

Answer 1 (score: 0)

I recently ran into exactly the same problem (though with a Flink project). It is the maven-shade-plugin that causes this error: as far as I can tell, older plugin versions rewrite relocated class files with a bytecode library that does not correctly regenerate Java 8 stack map frames, which is exactly what the VerifyError complains about. Upgrading maven-shade-plugin from version 2.4.3 to 3.1.0 should solve the problem.
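Concretely, assuming the pom from the question, the only change needed would be the plugin version:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <!-- 3.x bundles a newer bytecode library that handles Java 8
         stack map frames correctly when relocating classes -->
    <version>3.1.0</version>
</plugin>
```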