Error sending result RpcResponse / closing connection - Datastax Enterprise

Date: 2019-05-24 13:10:26

Tags: apache-spark datastax-enterprise

I am running a Scala application built with Maven, using spark-submit to send a fat JAR from a client node to a Datastax Enterprise cluster (on Azure).
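The artifact name used below (big-data-engine-0.0.1-jar-with-dependencies.jar) matches the default output of the maven-assembly-plugin's jar-with-dependencies descriptor, so the build section presumably looks roughly like this minimal sketch (plugin version and other details assumed):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <!-- Produces target/big-data-engine-0.0.1-jar-with-dependencies.jar -->
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <!-- Build the fat JAR as part of mvn package -->
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>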

Everything appears to run fine, and the job is indeed submitted to the Spark worker/master, but at some point it starts throwing the following error continuously and never exits:

[rpc-server-17-1] ERROR org.apache.spark.network.server.TransportRequestHandler - Error sending result RpcResponse{requestId=6159268836916637242, body=NioManagedBuffer{buf=java.nio.HeapByteBuffer[pos=0 lim=47 cap=64]}} to /10.0.0.4:40852; closing connection
java.lang.AbstractMethodError: null
    at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:77)
    at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:116)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:810)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
    at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:816)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723)
    at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:302)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730)
    at io.netty.channel.AbstractChannelHandlerContext.access$1900(AbstractChannelHandlerContext.java:38)
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1081)
    at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1128)
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1070)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:465)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.lang.Thread.run(Thread.java:748)
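From what I understand, an AbstractMethodError thrown from io.netty.util.ReferenceCountUtil.touch usually means two incompatible Netty versions are mixed on the classpath (touch() only exists in Netty 4.1+), which is why I tried excluding Netty in the pom.xml below.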

All dependencies download normally and the project compiles without problems. I have the following in my pom.xml:

<dependency>
  <groupId>com.datastax.dse</groupId>
  <artifactId>dse-spark-dependencies</artifactId>
  <version>6.7.1</version>
  <scope>provided</scope>
</dependency>

<dependency>
  <groupId>com.datastax.dse</groupId>
  <artifactId>dse-byos_2.11</artifactId>
  <version>6.7.1</version>
  <scope>provided</scope>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<dependency>
  <groupId>com.datastax.dse</groupId>
  <artifactId>spark-connector</artifactId>
  <version>6.7.1</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.solr</groupId>
      <artifactId>solr-solrj</artifactId>
    </exclusion>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<dependency>
  <groupId>com.datastax.oss</groupId>
  <artifactId>java-driver-core-shaded</artifactId>
  <version>4.0.0</version>
</dependency>

<!--
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-transport</artifactId>
  <version>4.1.25.4.dse</version>
</dependency>
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-transport-native-epoll</artifactId>
  <version>4.1.25.Final</version>
  <classifier>linux-x86_64</classifier>
</dependency>
-->

Based on https://docs.datastax.com/en/developer/java-driver/3.3/faq/ I also tried passing the parameter that forces the driver to use NIO (FORCE_NIO), but it made no difference.

I even tried running the application like this:

dse -u cassandra -p pass spark-submit --conf "spark.driver.extraClassPath=$(dse spark-classpath)" --class my.package.bde.TestSparkApp target/big-data-engine-0.0.1-jar-with-dependencies.jar -Dcom.datastax.driver.FORCE_NIO=true

but that raised a different error:

Caused by: java.lang.NoClassDefFoundError: Could not initialize class io.netty.channel.epoll.EpollEventLoop
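(I realize the -D flag may never reach the driver JVM in that position, since spark-submit treats everything after the application JAR as program arguments for the application's main method; presumably it would have to be passed via --driver-java-options "-Dcom.datastax.driver.FORCE_NIO=true" instead.)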

1 Answer:

Answer 0 (score: 0)

Your build does indeed have incorrect dependencies, as pointed out in a previous answer. You only need to keep

<dependency>
  <groupId>com.datastax.dse</groupId>
  <artifactId>dse-spark-dependencies</artifactId>
  <version>6.7.1</version>
  <scope>provided</scope>
</dependency>

and remove everything else: dse-byos, spark-connector, netty, java-driver-core, and so on.
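The dse-spark-dependencies artifact already pulls in the Spark, connector, java-driver, and Netty versions that match the DSE 6.7.1 runtime, and because it is provided-scope none of them end up inside the fat JAR, so the cluster's own Netty is the only one on the classpath at run time. The AbstractMethodError above is a typical symptom of a second, incompatible Netty leaking in from the other dependencies; after trimming the pom you can verify that nothing still drags in its own copy with mvn dependency:tree -Dincludes=io.netty.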