I have a custom object that I want to use in Spark.
var myTuple : Seq[(String, MyCustomObject)] = Seq()
import java.io.{IOException, ObjectInputStream, ObjectOutputStream}

class MyCustomObject(var x: String) extends Serializable {
  override def toString: String = x

  @throws[IOException]
  private def writeObject(out: ObjectOutputStream): Unit = {
    out.writeObject(x)
  }

  @throws[IOException]
  @throws[ClassNotFoundException]
  private def readObject(in: ObjectInputStream): Unit = {
    // readObject() takes no arguments; cast its result instead
    x = in.readObject().asInstanceOf[String]
  }
}
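As a sanity check that the class serializes correctly at all, it can be round-tripped through plain Java serialization without Spark. This is a minimal sketch; `SerializationCheck` and `roundTrip` are names invented for it, and since the only field is a `String`, default serialization suffices here:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

class MyCustomObject(var x: String) extends Serializable {
  override def toString: String = x
}

object SerializationCheck {
  // Write the object to an in-memory byte array, then read it back.
  def roundTrip(obj: MyCustomObject): MyCustomObject = {
    val buffer = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buffer)
    out.writeObject(obj)
    out.close()

    val in = new ObjectInputStream(new ByteArrayInputStream(buffer.toByteArray))
    in.readObject().asInstanceOf[MyCustomObject]
  }
}
```

If this round trip works, the remaining problem is not serialization but the missing Dataset `Encoder`, which is a separate mechanism in Spark SQL.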
In the code above I define a custom object class that extends Serializable, but I still get the following error:
java.lang.UnsupportedOperationException: No Encoder found for MyCustomObject
Can someone point me in the right direction?
Edit:
I looked at the suggested duplicate, but I still can't get the following to work:
var myTuple : Seq[Seq[(String, MyCustomObject)]] = Seq()
What I have basically added is:
import scala.reflect.ClassTag
import org.apache.spark.sql.{Encoder, Encoders}

implicit def single[A](implicit c: ClassTag[A]): Encoder[A] = Encoders.kryo[A](c)

implicit def tuple2[A1, A2](
  implicit e1: Encoder[A1],
  e2: Encoder[A2]
): Encoder[(A1, A2)] = Encoders.tuple[A1, A2](e1, e2)
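The way these two implicits are meant to compose (a kryo-style fallback for leaf types, plus a combinator for pairs) can be illustrated with a toy type class in plain Scala. `Enc` here is a stand-in invented for this sketch, not Spark's real `Encoder` API; the low-priority trait mirrors the usual trick for avoiding ambiguity between the generic fallback and the tuple combinator:

```scala
import scala.reflect.ClassTag

// Toy stand-in for Spark's Encoder, only to show how implicit resolution chains.
trait Enc[A] { def name: String }

// The generic kryo-style fallback lives in a lower-priority trait so the
// tuple combinator is preferred for pair types without an ambiguity error.
trait LowPriorityEnc {
  implicit def single[A](implicit c: ClassTag[A]): Enc[A] =
    new Enc[A] { val name = s"kryo[${c.runtimeClass.getSimpleName}]" }
}

object Enc extends LowPriorityEnc {
  // Analogous to Encoders.tuple: build a pair encoder from two element encoders.
  implicit def tuple2[A1, A2](implicit e1: Enc[A1], e2: Enc[A2]): Enc[(A1, A2)] =
    new Enc[(A1, A2)] { val name = s"tuple(${e1.name}, ${e2.name})" }
}
```

Resolving `Enc[(String, MyCustomObject)]` then chains `tuple2` over two `single` instances, which is the same shape of derivation the real `Encoder` implicits need to perform for `Seq[(String, MyCustomObject)]`.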