Spark ClassCastException: cannot assign instance of FiniteDuration to field RpcTimeout.duration, on Scala 2.10.5

Asked: 2018-04-05 10:43:36

Tags: java scala apache-spark

I get this exception when I try to submit a job. What can I try? The JAR was compiled against Scala 2.10.5 and uses

kafka_2.10-0.8.2.0.jar,

kafka-clients-0.8.2.0.jar

Below is the full stack trace of the exception:
java.lang.ClassCastException: cannot assign instance of scala.concurrent.duration.FiniteDuration to field org.apache.spark.rpc.RpcTimeout.duration of type scala.concurrent.duration.FiniteDuration in instance of org.apache.spark.rpc.RpcTimeout
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133) ~[na:1.8.0_74]
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2006) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371) ~[na:1.8.0_74]
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:261) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) ~[correctedViewershipUserProfile-1.14-SNAPSHOT-jar-with-dependencies.jar:na]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:313) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:260) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) ~[correctedViewershipUserProfile-1.14-SNAPSHOT-jar-with-dependencies.jar:na]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:259) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:590) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:572) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.sasl.SaslRpcHandler.receive(SaslRpcHandler.java:80) ~[spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:154) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]

1 Answer:

Answer 0 (score: 0):

Are you using a shaded jar? You could try excluding the Scala library from kafka_2.10. This kind of ClassCastException, where a class "cannot be assigned" to a field of the very same type, usually means two copies of scala-library are on the classpath (one bundled into your fat jar, one provided by the Spark runtime), so the class is loaded by two different class loaders.
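A minimal sketch of that exclusion, assuming an sbt build (the coordinates are inferred from the kafka_2.10-0.8.2.0 and spark-core_2.10-1.6.0-cdh5.12.0 jars named above; adjust them to your actual build):

```scala
// build.sbt (sketch): pull in Kafka 0.8.2.0 but keep its transitive
// scala-library dependency out of the assembly, so the fat jar does not
// ship a second copy of the Scala runtime alongside the one Spark provides.
libraryDependencies += "org.apache.kafka" % "kafka_2.10" % "0.8.2.0" exclude("org.scala-lang", "scala-library")

// Marking Spark itself "provided" keeps spark-core (and its own Scala
// runtime) out of the assembly as well.
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0-cdh5.12.0" % "provided"
```

With Maven, the equivalent is an `<exclusions>` block on the kafka_2.10 dependency (or an artifact filter in the shade plugin). After rebuilding, you can verify the assembly no longer bundles the Scala runtime with something like `jar tf your-assembly.jar | grep '^scala/'`.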