What is wrong with this code? I cannot get past "Task not serializable"
@throws(classOf[Exception])
override def setUp(cfg: BenchmarkConfiguration) {
  super.setUp(cfg)
  sc = new SparkContext("local[4]", "BenchmarkTest")
  sqlContext = new HiveContext(sc)
  ic = new IgniteContext[RddKey, RddVal](sc,
    () ⇒ configuration("client", client = true))
  icCache = ic.fromCache(PARTITIONED_CACHE_NAME)
  icCache.savePairs(sc.parallelize(
    (0 until 1000).map { n => (n.toLong, s"Value for key $n") }, 10)) // Error happens here: this is "line 89"
  println(icCache.collect)
}
Here is the stack trace:
<20:47:45><yardstick> Failed to start benchmark server (will stop and exit).
org.apache.spark.SparkException: Task not serializable
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:158)
at org.apache.spark.SparkContext.clean(SparkContext.scala:1623)
at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:805)
at org.apache.ignite.spark.IgniteRDD.savePairs(IgniteRDD.scala:170)
at org.yardstickframework.spark.SparkAbstractBenchmark.setUp(SparkAbstractBenchmark.scala:89)
at org.yardstickframework.spark.SparkCoreRDDBenchmark.setUp(SparkCoreRDDBenchmark.scala:18)
at org.yardstickframework.spark.SparkCoreRDDBenchmark$.main(SparkCoreRDDBenchmark.scala:72)
at org.yardstickframework.spark.SparkNode.start(SparkNode.scala:28)
at org.yardstickframework.BenchmarkServerStartUp.main(BenchmarkServerStartUp.java:61)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$.getObjFieldValues$extension(SerializationDebugger.scala:240)
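Note: in general this exception means the closure handed to an RDD operation captures a non-serializable object, most often the enclosing class through a field or method reference; in the code above, the () ⇒ configuration("client", client = true) factory references the benchmark instance and is one candidate for such a capture. A minimal sketch of the standard workaround, with hypothetical class and field names:

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Generic illustration only; names are hypothetical.
class Example(sc: SparkContext) {
  val prefix = "Value for key" // field on a non-serializable class

  def bad(): RDD[(Long, String)] =
    // `prefix` desugars to `this.prefix`, so the closure drags in `this`
    // and Spark has to serialize the whole Example instance.
    sc.parallelize(0L until 1000L).map(n => (n, s"$prefix $n"))

  def good(): RDD[(Long, String)] = {
    val p = prefix // local copy: the closure now captures only `p`
    sc.parallelize(0L until 1000L).map(n => (n, s"$p $n"))
  }
}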
Answer 0 (score: 0)
It looks like your code was compiled against a different version of Scala than the one the Ignite or Spark modules were built with. I saw a similar exception when my code was compiled for Scala 2.10 while Spark was running on Scala 2.11, or vice versa. The module com.databricks:spark-csv_2.10:1.1.0 could be the reason.
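One hedged way to rule this out in an sbt build is to pin a single Scala version and let sbt choose the matching artifact suffixes; the snippet below is a sketch, and the version numbers are illustrative rather than taken from the question:

// build.sbt -- sketch only; version numbers are illustrative.
// `%%` appends the Scala binary suffix (_2.10 here) automatically, so these
// dependencies always resolve against the project's own Scala version.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.1" % "provided",
  "org.apache.spark" %% "spark-hive" % "1.3.1" % "provided",
  // Artifacts published with an explicit suffix must be matched by hand:
  "com.databricks" % "spark-csv_2.10" % "1.1.0"
)

If any jar on the classpath carries a _2.11 suffix while the rest are _2.10, a reflective failure inside SerializationDebugger like the one above can be the symptom.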