How do I return a concrete type when overriding a method with a generic type in Scala?

Asked: 2016-08-31 20:20:16

Tags: scala generics apache-spark scala-generics

Please take a look at the following code snippet:

import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
import org.apache.spark.streaming.Seconds

abstract class MQTTDStream[T <: Any](ssc: StreamingContext) extends DStream[T](ssc) {
  override def compute(validTime: Time): Option[RDD[T]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1)) // This line doesn't compile

  override def dependencies = Nil

  override def slideDuration = Seconds(1) // just an example
}

I get the following error:

type mismatch; found: Int(1) required: T

I declared T to extend Any, so why is the compiler complaining? Int is a subtype of Any, isn't it?

Thanks a lot!

Update 2.9.16:

I changed it to extend DStream[Int], but I still get the same error:

abstract class MQTTDStream[T](ssc: StreamingContext) extends DStream[Int](ssc) {
  override def compute(validTime: Time): Option[RDD[T]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1)) // This line doesn't compile

  override def dependencies = Nil

  override def slideDuration = Seconds(1) // just an example
}

Edit 2.9.16:

Thanks to Alexey, this is the working solution:

import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
import org.apache.spark.streaming.Seconds

abstract class MQTTDStream[T](ssc: StreamingContext) extends DStream[Int](ssc) {
  override def compute(validTime: Time): Option[RDD[Int]] =
    Some(ssc.sparkContext.parallelize(Seq(1, 2, 3), 1))

  override def dependencies = Nil

  override def slideDuration = Seconds(1) // just an example
}
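The working version pins the element type to Int in both the parent type and the override. If a truly generic element type were needed, one possible approach is to let the caller supply the values, so that compute can legitimately produce an RDD[T]. The following is only a sketch: the GenericMQTTDStream name, the elems constructor parameter, and the ClassTag context bound are illustrative assumptions, not part of the accepted solution.

import scala.reflect.ClassTag

import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext, Time}
import org.apache.spark.streaming.dstream.DStream

// Hypothetical generic variant: the element type stays a type parameter,
// so the caller provides the elements (and a ClassTag, which both DStream
// and parallelize require).
abstract class GenericMQTTDStream[T: ClassTag](ssc: StreamingContext, elems: Seq[T])
    extends DStream[T](ssc) {

  override def compute(validTime: Time): Option[RDD[T]] =
    Some(ssc.sparkContext.parallelize(elems, 1)) // RDD[T], matching the declared return type

  override def dependencies = Nil

  override def slideDuration = Seconds(1) // just an example
}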

1 Answer:

Answer 0 (score: 2)

The caller gets to choose T, not you. So your class definition has to work for every T that satisfies the type bounds, and since every type is a subtype of Any, the bound T <: Any rules nothing out.

That is, if someone creates, say, an MQTTDStream[String], then its compute method must return an Option[RDD[String]]. But yours doesn't: it returns Some[RDD[Int]].
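The same issue can be reproduced without Spark. Here is a minimal sketch (the Box, BadBox and IntBox names are made up for illustration) of why the generic override cannot return a hard-coded Int collection, and why fixing the type parameter makes it compile:

abstract class Box[T] {
  def compute(): Option[Seq[T]]
}

// Does not compile: the caller picks T (it could be String), so returning
// Seq(1, 2, 3) only satisfies the contract when T happens to be Int.
//
//   class BadBox[T] extends Box[T] {
//     override def compute(): Option[Seq[T]] = Some(Seq(1, 2, 3)) // type mismatch: found Int, required T
//   }

// Compiles: the element type is fixed to Int in the parent type, so the
// override may return the concrete type.
class IntBox extends Box[Int] {
  override def compute(): Option[Seq[Int]] = Some(Seq(1, 2, 3))
}

In short, either fix the type parameter where the concrete data is produced (as in the accepted solution), or keep it generic and make the data itself generic.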