Unable to start Spark on the master URL

Date: 2018-08-07 22:31:40

Tags: maven apache-spark spring-data

I am building a REST microservice with Spark and Cassandra. When I set the Spark master to a local value, everything works fine.
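For context, the master URL is handed to Spark roughly as in the sketch below (names here are assumptions for illustration; the actual configuration class is not shown in the question). With a local master the service starts; pointing the same setting at the standalone master URL is what triggers the error further down.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkContextFactory {

        // masterUrl comes from application configuration, e.g. "local[*]" or "spark://ip:7077"
        public JavaSparkContext create(String masterUrl, String cassandraHost) {
            SparkConf conf = new SparkConf()
                    .setAppName("rest-microservice")                        // assumed app name
                    .setMaster(masterUrl)
                    .set("spark.cassandra.connection.host", cassandraHost); // Cassandra contact point
            return new JavaSparkContext(conf);
        }
    }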

But when I set the Spark master URL to "spark://ip:7077", starting the REST service fails with the following error:

Caused by: java.io.IOException: Failed to send RPC 5099964663881645984 to /98.8.150.125:7077: java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
    at org.apache.spark.network.client.TransportClient.lambda$sendRpc$2(TransportClient.java:237) ~[spark-network-common_2.11-2.2.2.jar!/:2.2.2]
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:511) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:485) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:424) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:121) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.notifyOutboundHandlerException(AbstractChannelHandlerContext.java:837) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:740) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:816) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:302) ~[netty-handler-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.access$1900(AbstractChannelHandlerContext.java:38) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1081) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1128) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1070) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    ... 1 common frames omitted

Caused by: java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
    at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:77) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:116) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:810) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111) ~[netty-codec-4.1.24.Final.jar!/:4.1.24.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738) ~[netty-transport-4.1.24.Final.jar!/:4.1.24.Final]
    ... 16 common frames omitted

I am using the following Spark and Cassandra dependencies in my REST service:

<dependencies>
    <dependency>
        <groupId>org.springframework.data</groupId>
        <artifactId>spring-data-cassandra</artifactId>
    </dependency>


    <dependency>
        <groupId>com.datastax.cassandra</groupId>
        <artifactId>cassandra-driver-core</artifactId>
        <version>${cassandra.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.sql.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.sql.version}</version>
    </dependency>

    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-unshaded_2.11</artifactId>
        <version>${spark.cassandra.connector.unshaded.version}</version>
    </dependency>
</dependencies>

I also tried setting the Spark master URL in spark-env.sh under the Spark conf directory, but that did not work either. Has anyone run into a similar problem? Any help is appreciated.

1 Answer:

Answer 0 (score: 0)

Found it (after a lot of searching, trial, and error). This is an issue with Spark 2.2.2: Spark still relies on Netty 4.0, while my application pulls in the Netty 4.1 libraries, so the two versions conflict (hence the AbstractMethodError). I simply added a Netty 4.0 dependency, and it works now.
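For reference, the fix amounts to putting a Netty 4.0 artifact on the classpath explicitly. A minimal sketch of such a pom entry is below; the exact 4.0.x version is an assumption here and should be matched to whatever Spark 2.2.x actually pulls in:

    <dependency>
        <groupId>io.netty</groupId>
        <artifactId>netty-all</artifactId>
        <!-- assumed version: align with the netty-all version reported for Spark 2.2.x -->
        <version>4.0.43.Final</version>
    </dependency>

Running mvn dependency:tree -Dincludes=io.netty is a quick way to confirm which Netty versions end up on the classpath after the change.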

More details on this issue: https://issues.apache.org/jira/browse/SPARK-21143?jql=project%20%3D%20SPARK%20AND%20text%20~%20abstractmethoderror