Spark custom Kryo encoder does not provide a schema for UDFs

Asked: 2017-06-24 10:05:58

Tags: apache-spark apache-spark-sql spark-dataframe kryo apache-spark-encoders

Following How to store custom objects in Dataset? and trying to register my own Kryo encoder for a DataFrame, I run into the error Schema for type com.esri.core.geometry.Envelope is not supported.

I have a function that parses a String (WKT) into a geometry object, like:

import com.esri.core.geometry.{Envelope, SpatialReference}
import com.esri.core.geometry.ogc.OGCGeometry

def mapWKTToEnvelope(wkt: String): Envelope = {
  val envBound = new Envelope()
  val spatialReference = SpatialReference.create(4326)
  // Parse the WKT string into a geometry object
  val ogcObj = OGCGeometry.fromText(wkt)
  ogcObj.setSpatialReference(spatialReference)
  ogcObj.getEsriGeometry.queryEnvelope(envBound)
  envBound
}

The function is used in a UDF like:

implicit val envelopeEncoder: Encoder[Envelope] = Encoders.kryo[Envelope]
val ST_Envelope = udf((wkt: String) => mapWKTToEnvelope(wkt))

However, while the UDF compiles, it throws a runtime error:

[error] Exception in thread "main" java.lang.UnsupportedOperationException: Schema for type com.esri.core.geometry.Envelope is not supported
[error]         at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:733)
[error]         at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:671)
[error]         at org.apache.spark.sql.functions$.udf(functions.scala:3076)

Edit

Even though

val first = df.as[(String, String)].first
val envBound = new Envelope()
val spatialReference = SpatialReference.create(4326)
val ogcObj = OGCGeometry.fromText(first._1)
ogcObj.setSpatialReference(spatialReference)
ogcObj.getEsriGeometry.queryEnvelope(envBound)
spark.createDataset(Seq(envBound))(envelopeEncoder)

works perfectly well:

root
 |-- value: binary (nullable = true)
+--------------------+
|               value|
+--------------------+
|[01 00 63 6F 6D 2...|
+--------------------+

How can I get this to work in a UDF?

0 Answers:

No answers yet.