I'm trying to write some functions that use compound types in their type parameters:
trait Contains[T]
trait Sentence
trait Token
def sentenceSegmenter[T] = (c: Contains[T]) => null: Contains[T with Sentence]
def tokenizer[T <: Sentence] = (c: Contains[T]) => null: Contains[T with Token]
My main goal is to be able to compose them with something as simple as:
val pipeline = sentenceSegmenter andThen tokenizer
However, that produces a compile error, because Scala infers that tokenizer needs to have the type Contains[? with Sentence] => ?:
scala> val pipeline = sentenceSegmenter andThen tokenizer
<console>:12: error: polymorphic expression cannot be instantiated to expected type;
found : [T <: Sentence]Contains[T] => Contains[T with Token]
required: Contains[? with Sentence] => ?
val pipeline = sentenceSegmenter andThen tokenizer
^
I tried a slightly different definition of tokenizer that more closely matches the type Scala is inferring, but I got a similar error:
scala> def tokenizer[T] = (c: Contains[T with Sentence]) => null: Contains[T with Sentence with Token]
tokenizer: [T]=> Contains[T with Sentence] => Contains[T with Sentence with Token]
scala> val pipeline = sentenceSegmenter andThen tokenizer
<console>:12: error: polymorphic expression cannot be instantiated to expected type;
found : [T]Contains[T with Sentence] => Contains[T with Sentence with Token]
required: Contains[? with Sentence] => ?
val pipeline = sentenceSegmenter andThen tokenizer
^
I can get things to compile if I specify pretty much any type for sentenceSegmenter, or if I create a dummy initial function that has no type parameter:
scala> val pipeline = sentenceSegmenter[Nothing] andThen tokenizer
pipeline: Contains[Nothing] => Contains[Nothing with Sentence with Sentence with Token] = <function1>
scala> val pipeline = sentenceSegmenter[Any] andThen tokenizer
pipeline: Contains[Any] => Contains[Any with Sentence with Sentence with Token] = <function1>
scala> val begin = identity[Contains[Any]] _
begin: Contains[Any] => Contains[Any] = <function1>
scala> val pipeline = begin andThen sentenceSegmenter andThen tokenizer
pipeline: Contains[Any] => Contains[Any with Sentence with Sentence with Token] = <function1>
I don't mind the type Any or Nothing being inferred, since I don't really care what T is. (I mostly care about the with XXX parts.) But I would like it to be inferred, rather than having to specify it explicitly or supply it through a dummy initial function.
Answer 0 (score: 1)
You can't bind the type parameter with a val. You have to use a def, so that the type is bound at the point of use:
def pipeline[T] = sentenceSegmenter[T] andThen tokenizer
Note that you can call pipeline with the type parameter inferred:
scala> new Contains[Sentence] {}
res1: Contains[Sentence] = $anon$1@5aea1d29
scala> pipeline(res1)
res2: Contains[Sentence with Sentence with Token] = null
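Putting it together, a minimal self-contained sketch (Scala 2), assuming only the trait and function definitions from the question. The underlying reason a val cannot work here is that a Scala 2 function value is monomorphic: its type parameter would have to be fixed when the val is defined, whereas a def keeps the type parameter and defers binding to each call site:
// Sketch of the fix, reusing the question's definitions unchanged.
trait Contains[T]
trait Sentence
trait Token

def sentenceSegmenter[T] = (c: Contains[T]) => null: Contains[T with Sentence]
def tokenizer[T <: Sentence] = (c: Contains[T]) => null: Contains[T with Token]

// sentenceSegmenter[T] is instantiated to a concrete Function1 before andThen is
// applied, so the compiler can unify tokenizer's type parameter with T with Sentence.
def pipeline[T] = sentenceSegmenter[T] andThen tokenizer

// T is inferred from the argument (here T = Sentence), giving a result of type
// Contains[Sentence with Sentence with Token]; the runtime value is null, because
// the stub functions above simply return null.
val result = pipeline(new Contains[Sentence] {})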