How to create a Dataset of Map?

Time: 2017-10-16 20:33:50

Tags: scala apache-spark apache-spark-sql apache-spark-dataset apache-spark-encoders

I am using Spark 2.2 and I am running into trouble trying to call spark.createDataset on a Seq of Map.

The code and output from my Spark shell session follow:

// createDataSet on Seq[T] where T = Int works
scala> spark.createDataset(Seq(1, 2, 3)).collect
res0: Array[Int] = Array(1, 2, 3)

scala> spark.createDataset(Seq(Map(1 -> 2))).collect
<console>:24: error: Unable to find encoder for type stored in a Dataset.  Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._  Support for serializing other types will be added in future releases.
       spark.createDataset(Seq(Map(1 -> 2))).collect
                          ^

// createDataSet on a custom case class containing Map works
scala> case class MapHolder(m: Map[Int, Int])
defined class MapHolder

scala> spark.createDataset(Seq(MapHolder(Map(1 -> 2)))).collect
res2: Array[MapHolder] = Array(MapHolder(Map(1 -> 2)))

I have already tried import spark.implicits._, although I am fairly certain it is imported implicitly by the Spark shell session.

Is this a case that is not covered by the current encoders?

2 Answers:

Answer 0 (score: 7):

It is not covered in 2.2, but it can easily be addressed. You can add the required Encoder using ExpressionEncoder, either explicitly:

import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder  
import org.apache.spark.sql.Encoder

spark
  .createDataset(Seq(Map(1 -> 2)))(ExpressionEncoder(): Encoder[Map[Int, Int]])

or implicitly:

implicit def mapIntIntEncoder: Encoder[Map[Int, Int]] = ExpressionEncoder()
spark.createDataset(Seq(Map(1 -> 2)))
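
As a side note (this is a sketch of my own, not part of the original answer), a Kryo-based encoder can also fill the gap on 2.2, at the cost of storing the map as an opaque binary column rather than a typed map column:

import org.apache.spark.sql.{Encoder, Encoders}

// Sketch: Kryo serialization provides an Encoder[Map[Int, Int]] when no
// built-in encoder is available; the data is stored as a binary blob.
implicit val kryoMapEncoder: Encoder[Map[Int, Int]] = Encoders.kryo[Map[Int, Int]]

spark.createDataset(Seq(Map(1 -> 2))).collect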

Answer 1 (score: 2):

FYI, the above expression simply works in Spark 2.3 (as of this commit, if I am not mistaken).

scala> spark.version
res0: String = 2.3.0

scala> spark.createDataset(Seq(Map(1 -> 2))).collect
res1: Array[scala.collection.immutable.Map[Int,Int]] = Array(Map(1 -> 2))

I think that is because newMapEncoder is now part of spark.implicits.

scala> :implicits
...
  implicit def newMapEncoder[T <: scala.collection.Map[_, _]](implicit evidence$3: reflect.runtime.universe.TypeTag[T]): org.apache.spark.sql.Encoder[T]
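
For illustration (my own sketch, not part of the original answer), since newMapEncoder is a public member of spark.implicits, you can also bring just that one encoder into scope instead of using the wildcard import:

import org.apache.spark.sql.Encoder

// Sketch: assumes Spark 2.3+, where newMapEncoder exists (see the :implicits output above).
implicit val mapEncoder: Encoder[Map[Int, Int]] = spark.implicits.newMapEncoder[Map[Int, Int]]

spark.createDataset(Seq(Map(1 -> 2))).collect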

You can "disable" the implicit with the following trick and then try the above expression (which will give an error).

trait ThatWasABadIdea
implicit def newMapEncoder(ack: ThatWasABadIdea) = ack

scala> spark.createDataset(Seq(Map(1 -> 2))).collect
<console>:26: error: Unable to find encoder for type stored in a Dataset.  Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._  Support for serializing other types will be added in future releases.
       spark.createDataset(Seq(Map(1 -> 2))).collect
                          ^
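
Even with the implicit shadowed this way, the explicit form from the first answer still works, since it bypasses implicit resolution entirely (a sketch combining the two answers):

import org.apache.spark.sql.Encoder
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder

// Sketch: pass the encoder by hand instead of relying on implicit search.
spark.createDataset(Seq(Map(1 -> 2)))(ExpressionEncoder(): Encoder[Map[Int, Int]]).collect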