Array extension where Element is Optional, returning the Wrapped type

Date: 2019-02-09 11:25:11

Tags: arrays swift generics optional

I want to create an Array extension where the array's Element is an optional, and the method's return type is the non-optional (wrapped) Element type.

Is that possible, and if so, what is the syntax?

The main idea, in pseudocode:

extension Array where Element: Optional {
  func foo() -> ReturnType<Wrapped<Element>> {
    ...
  }
}
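
For reference, Swift will not accept `Element: Optional` written literally (Optional is a generic enum, not a protocol), but the same constraint can be expressed on a generic method with `Element == T?`. A minimal sketch, assuming Swift 4.1+ for `compactMap`; the method name `unwrapped` is only illustrative:

extension Array {
    // Illustrative sketch: collect the non-nil elements, unwrapped to their Wrapped type.
    func unwrapped<T>() -> [T] where Element == T? {
        return compactMap { $0 }
    }
}

let names: [String?] = ["A", nil, "B"]
let nonNil: [String] = names.unwrapped()   // ["A", "B"]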

1 Answer:

Answer 0 (score: 1)

I'm not sure what `ReturnType` is supposed to be, but since you need to return something, why not take a closure that supplies the fallback value? For instance, use a function like this to get the element at a specific index as `Wrapped<Element>`:

Example

extension Array {
    /// Returns the unwrapped value at `index`, or the result of
    /// `emptyAction` if the stored optional is nil.
    func valueAt<T>(_ index: Int, emptyAction: () -> T) -> T where Element == T? {
        if let value = self[index] {
            return value
        }
        return emptyAction()
    }
}
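
Note that the function above still traps if `index` is out of bounds. If that matters, a hypothetical bounds-checked variant (the `safe` label is only illustrative) could fall back to the closure in that case as well:

extension Array {
    // Hypothetical variant: also falls back to emptyAction when the index is out of bounds.
    func valueAt<T>(safe index: Int, emptyAction: () -> T) -> T where Element == T? {
        guard indices.contains(index), let value = self[index] else {
            return emptyAction()
        }
        return value
    }
}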

Output

var test = [String?]()
test.append("ABC")
test.append("DEF")
test.append(nil)

for i in 0..<test.count {
    print(test.valueAt(i, emptyAction: { "<empty>" }))
}
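
With the three values appended above, the loop prints:

ABC
DEF
<empty>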