I'm trying to create a Seq of methods that will be run against a Spark DataFrame. At the moment I build this Seq explicitly at runtime:

val allFuncs: Seq[DataFrame => DataFrame] = Seq(func1, func2, func3)

def func1(df: DataFrame): DataFrame = ???
def func2(df: DataFrame): DataFrame = ???
def func3(df: DataFrame): DataFrame = ???
I have added the ability for developers to mark methods with an annotation, and I build a sequence of MethodMirrors from it like this, but I would like getMyFuncs to return a Seq[DataFrame => DataFrame]:
def getMyFuncs(): Seq[DataFrame => DataFrame] = {
  // Gets anything with the @MyFunc annotation (T is the class declaring the annotated methods)
  val listOfAnnotations = typeOf[T].members
    .flatMap(f => f.annotations.find(_.tree.tpe =:= typeOf[MyFunc]).map((f, _)))
    .toList
  val rm = runtimeMirror(this.getClass.getClassLoader)
  val instanceMirror = rm.reflect(this)
  // Reflect each annotated method symbol into a MethodMirror
  listOfAnnotations.map(annotation => instanceMirror.reflectMethod(annotation._1.asMethod)).toSeq
}
@MyFunc
def func1(df: DataFrame): DataFrame = ???
@MyFunc
def func2(df: DataFrame): DataFrame = ???
@MyFunc
def func3(df: DataFrame): DataFrame = ???
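For reference, the MyFunc annotation itself is not defined above; a minimal sketch, assuming it is a plain marker annotation extending StaticAnnotation so that the runtime-reflection lookup in getMyFuncs can see it:

import scala.annotation.StaticAnnotation

// Assumed marker annotation; the real definition may differ
class MyFunc extends StaticAnnotation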
However, the Seq returned by getMyFuncs is a Seq[reflect.runtime.universe.MethodMirror], not a Seq[DataFrame => DataFrame]. That is expected, but it is not the output I need. Is there a way to convert the MethodMirrors into plain Scala functions?
Answer (score: 2)
Try mapping over the mirrors:
val getMyFuncs: Seq[reflect.runtime.universe.MethodMirror] = ???
val getMyFuncs1: Seq[DataFrame => DataFrame] =
getMyFuncs.map(mirror => (dataFrame: DataFrame) => mirror(dataFrame).asInstanceOf[DataFrame])
That is, manually create a lambda using reflect.runtime.universe.MethodMirror#apply(..).
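Putting this together, here is a minimal self-contained sketch of how the annotation-driven lookup and the lambda wrapping could be combined so that getMyFuncs returns Seq[DataFrame => DataFrame] directly. The trait name Transformations and the use of the runtime class of this (instead of typeOf[T]) are illustrative assumptions, not part of the original code:

import org.apache.spark.sql.DataFrame
import scala.annotation.StaticAnnotation
import scala.reflect.runtime.universe._

// Marker annotation, as sketched above (assumed definition)
class MyFunc extends StaticAnnotation

// Hypothetical trait holding the annotated methods; the name is illustrative
trait Transformations {

  def getMyFuncs(): Seq[DataFrame => DataFrame] = {
    val rm = runtimeMirror(this.getClass.getClassLoader)
    val instanceMirror = rm.reflect(this)

    // Reflect on the runtime class of `this` rather than a type parameter T
    val tpe = rm.classSymbol(this.getClass).toType

    // Collect every method that carries the @MyFunc annotation
    val annotatedMethods = tpe.members.collect {
      case m: MethodSymbol if m.annotations.exists(_.tree.tpe =:= typeOf[MyFunc]) => m
    }.toSeq

    // Wrap each MethodMirror in an ordinary DataFrame => DataFrame lambda
    annotatedMethods.map { m =>
      val mirror = instanceMirror.reflectMethod(m)
      (df: DataFrame) => mirror(df).asInstanceOf[DataFrame]
    }
  }

  @MyFunc
  def func1(df: DataFrame): DataFrame = df // placeholder body

  @MyFunc
  def func2(df: DataFrame): DataFrame = df // placeholder body
}

An instance such as val t = new Transformations {} can then apply the discovered functions in sequence, e.g. t.getMyFuncs().foldLeft(df)((acc, f) => f(acc)), exactly as the explicitly built allFuncs would be used.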