I am trying to extend the Accumulator class in Scala, but it fails on the constructor. Here is the IntelliJ error:
Error:(44, 24) overloaded method constructor Accumulator with alternatives:
  (initialValue: org.apache.spark.AccumulatorParam[T], param: org.apache.spark.AccumulatorParam[org.apache.spark.AccumulatorParam[T]])org.apache.spark.Accumulator[org.apache.spark.AccumulatorParam[T]] <and>
  (initialValue: org.apache.spark.AccumulatorParam[T], param: org.apache.spark.AccumulatorParam[org.apache.spark.AccumulatorParam[T]], name: Option[String])org.apache.spark.Accumulator[org.apache.spark.AccumulatorParam[T]]
cannot be applied to ()
class MyAccumulator[T](initialValue: org.apache.spark.AccumulatorParam[T], param: org.apache.spark.AccumulatorParam[org.apache.spark.AccumulatorParam[T]])
Here is my code:
class MyAccumulator[T](initialValue: AccumulatorParam[T],
                       param: AccumulatorParam[AccumulatorParam[T]])
  extends Accumulator[AccumulatorParam[T]] with Serializable {

  def this(initialValue: AccumulatorParam[T],
           param: AccumulatorParam[org.apache.spark.AccumulatorParam[T]],
           name: Option[String]) = {
    this(initialValue, param, name)
  }

  override def setValue(newValue: AccumulatorParam[T]): Unit = super.setValue(newValue)
  override val id: Long = ???
  override val zero: AccumulatorParam[T] = ???
  override def +=(term: AccumulatorParam[T]): Unit = super.+=(term)
  override def add(term: AccumulatorParam[T]): Unit = super.add(term)
  override def ++=(term: AccumulatorParam[T]): Unit = super.++=(term)
  override def merge(term: AccumulatorParam[T]): Unit = super.merge(term)
  override def localValue: AccumulatorParam[T] = super.localValue
  override def value: AccumulatorParam[T] = super.value
  override def value_=(newValue: AccumulatorParam[T]): Unit = super.value_=(newValue)
  override def toString(): String = super.toString()
}
Since Scala generates the primary constructor from the class parameters I declared, I don't understand why the compiler thinks an empty constructor is being called. I also tried declaring a constructor explicitly, and that failed with the same error plus an additional duplicate-constructor error.
I am new to Scala and Spark, any help is appreciated!
Answer 0 (score: 2)
You need to pass the arguments through to the Accumulator class via its primary constructor:
class MyAccumulator[T](initialValue: AccumulatorParam[T],
                       param: org.apache.spark.AccumulatorParam[AccumulatorParam[T]])
  extends Accumulator[AccumulatorParam[T]](initialValue, param) with Serializable
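Note also that the auxiliary constructor in your code calls itself (this(initialValue, param, name) inside the three-argument this(...)), which would recurse rather than reach the superclass. Since your error message shows Accumulator also has a public (initialValue, param, name) alternative, a sketch supporting both forms could look like this (type parameters kept from your question; whether AccumulatorParam[T] is really the value type you want to accumulate is a separate issue):

class MyAccumulator[T](initialValue: AccumulatorParam[T],
                       param: AccumulatorParam[AccumulatorParam[T]],
                       name: Option[String])
  extends Accumulator[AccumulatorParam[T]](initialValue, param, name) with Serializable {

  // Unnamed variant: forward to the primary constructor instead of recursing
  def this(initialValue: AccumulatorParam[T],
           param: AccumulatorParam[AccumulatorParam[T]]) =
    this(initialValue, param, None)
}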
The error message:
org.apache.spark.Accumulator[org.apache.spark.AccumulatorParam[T]] cannot be applied to ()
appears because, with no explicit superclass constructor call, the Scala compiler looks for a suitable no-argument (arity-0) constructor on Accumulator; there isn't one, so it "cannot be applied to ()".
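The same rule applies to any Scala class, not just Spark's. A tiny stand-alone illustration (hypothetical classes):

class Base(x: Int)          // primary constructor requires an Int
class Bad extends Base      // compile error: no zero-argument Base constructor exists
class Good extends Base(42) // fine: the argument is passed up to Base's primary constructor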