When I try to do the following on Spark 1.4.1, I get org.apache.spark.SparkException: Task not serializable:
import java.sql.{Date, Timestamp}
import java.text.SimpleDateFormat
import org.apache.spark.sql.functions.udf

object ConversionUtils {
  // Shared ISO 8601 formatter and a method that parses a string into a Timestamp
  val iso8601 = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSX")
  def tsUTC(s: String): Timestamp = new Timestamp(iso8601.parse(s).getTime)
  // Register the method (eta-expanded to a Function1) as a Spark SQL UDF
  val castTS = udf[Timestamp, String](tsUTC _)
}
val df = frame.withColumn("ts", ConversionUtils.castTS(frame("ts_str")))
df.first
Here, frame is a DataFrame inside a HiveContext. There is nothing wrong with the frame itself.
I have similar UDFs for integers and they work without any problem. The timestamp one, however, seems to cause trouble. According to the documentation, java.sql.Timestamp implements Serializable, so that is not the problem. The same holds for SimpleDateFormat.
This leads me to believe the UDF is what causes the problem, but I am not sure what exactly is wrong or how to fix it.
The relevant part of the trace:
Caused by: java.io.NotSerializableException: ...
Serialization stack:
- object not serializable (class: ..., value: ...$ConversionUtils$@63ed11dd)
- field (class: ...$ConversionUtils$$anonfun$3, name: $outer, type: class ...$ConversionUtils$)
- object (class ...$ConversionUtils$$anonfun$3, <function1>)
- field (class: org.apache.spark.sql.catalyst.expressions.ScalaUdf$$anonfun$2, name: func$2, type: interface scala.Function1)
- object (class org.apache.spark.sql.catalyst.expressions.ScalaUdf$$anonfun$2, <function1>)
- field (class: org.apache.spark.sql.catalyst.expressions.ScalaUdf, name: f, type: interface scala.Function1)
- object (class org.apache.spark.sql.catalyst.expressions.ScalaUdf, scalaUDF(ts_str#2683))
- field (class: org.apache.spark.sql.catalyst.expressions.Alias, name: child, type: class org.apache.spark.sql.catalyst.expressions.Expression)
- object (class org.apache.spark.sql.catalyst.expressions.Alias, scalaUDF(ts_str#2683) AS ts#7146)
- element of array (index: 35)
- array (class [Ljava.lang.Object;, size 36)
- field (class: scala.collection.mutable.ArrayBuffer, name: array, type: class [Ljava.lang.Object;)
- object (class scala.collection.mutable.ArrayBuffer,
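The serialization stack already points at the culprit: the $outer field of the anonymous function class refers back to the ConversionUtils singleton, and it is that singleton, not the Timestamp or the formatter, that fails to serialize. The same failure can be reproduced with plain Java serialization and no Spark at all; a minimal sketch, assuming Scala 2.11 (the series Spark 1.4.x builds against), with Holder as a hypothetical stand-in for ConversionUtils:

import java.io.{ByteArrayOutputStream, ObjectOutputStream}

object Holder { // note: does NOT extend Serializable
  def twice(x: Int): Int = x * 2
  // Eta-expansion produces an anonymous Function1 class whose $outer field
  // references the enclosing Holder$ instance, just like in the trace above.
  val f: Int => Int = twice _
}

object Demo extends App {
  val out = new ObjectOutputStream(new ByteArrayOutputStream)
  // Throws java.io.NotSerializableException: Holder$ -- the function value is
  // serializable, but the Holder$ object it captured is not.
  out.writeObject(Holder.f)
}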
Answer 0 (score: 24)
Try:
object ConversionUtils extends Serializable {
...
}
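Extending Serializable lets the closure's $outer reference, the ConversionUtils singleton itself, be serialized along with the UDF, which is exactly what the stack above was complaining about. An alternative is to leave the object alone and build the UDF from a self-contained function literal that references no members of the object, so the generated function class has no $outer field at all; a sketch under that assumption, reusing the names from the question:

import java.sql.Timestamp
import java.text.SimpleDateFormat
import org.apache.spark.sql.functions.udf

object ConversionUtils {
  // The literal references only its parameter and locals, so the compiler has
  // no reason to capture the enclosing object; only the (serializable)
  // function value is shipped to the executors.
  val castTS = udf { (s: String) =>
    // Constructed per call: SimpleDateFormat is not thread-safe, so a single
    // shared instance could be corrupted by concurrent tasks even once the
    // serialization issue is fixed.
    val iso8601 = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSX")
    new Timestamp(iso8601.parse(s).getTime)
  }
}

The trade-off is one formatter allocation per row, which is usually negligible next to the cost of the parse itself.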