How to group objects in FS2 using a classifier function?

Date: 2018-06-23 16:53:45

Tags: scala functional-programming stream fs2

I have an unordered stream of measurements that I would like to group into fixed-size batches, so that they can be persisted efficiently later:

val measurements = for {
  id <- Seq("foo", "bar", "baz")
  value <- 1 to 5
} yield (id, value)

fs2.Stream.emits(scala.util.Random.shuffle(measurements)).toVector

That is, instead of:

(bar,4)
(foo,5)
(baz,3)
(baz,5)
(baz,4)
(foo,2)
(bar,2)
(foo,4)
(baz,1)
(foo,1)
(foo,3)
(bar,1)
(bar,5)
(bar,3)
(baz,2)

I would like the following structure, with a batch size of 3:

(bar,[4,2,1])
(foo,[5,2,4])
(baz,[3,5,4])
(baz,[1,2])
(foo,[1,3])
(bar,[5,3])

Is there a simple, idiomatic way to achieve this in FS2? I know there is a groupAdjacentBy function, but that only considers adjacent items (illustrated briefly below).

I am currently on 0.10.5.
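
For illustration, here is a minimal sketch (against the fs2 1.x API, where groupAdjacentBy takes a key function and needs a cats Eq for the key in scope) of why it does not help with shuffled input:

import cats.implicits._
import fs2.Stream

// groupAdjacentBy only merges consecutive elements with equal keys:
// the trailing ("a", 4) is not adjacent to the first run of "a"s,
// so it ends up in a group of its own.
Stream("a" -> 1, "a" -> 2, "b" -> 3, "a" -> 4)
  .groupAdjacentBy(_._1)
  .toList
// three groups: ("a", [(a,1), (a,2)]), ("b", [(b,3)]), ("a", [(a,4)])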

1 Answer:

Answer 0 (score: 0)

This can be done with an fs2 Pull:

import cats.data.{NonEmptyList => Nel}
import fs2._

object GroupingByKey {
  def groupByKey[F[_], K, V](limit: Int): Pipe[F, (K, V), (K, Nel[V])] = {
    require(limit >= 1)

    // `state` holds, per key, the values collected so far (in reverse order).
    def go(state: Map[K, List[V]]): Stream[F, (K, V)] => Pull[F, (K, Nel[V]), Unit] = _.pull.uncons1.flatMap {
      case Some(((key, num), tail)) =>
        val prev = state.getOrElse(key, Nil)
        if (prev.size == limit - 1) {
          // The group for this key is full: emit it and drop the key from the state.
          val group = Nel.ofInitLast(prev.reverse, num)
          Pull.output1(key -> group) >> go(state - key)(tail)
        } else {
          // Still below the limit: keep accumulating.
          go(state.updated(key, num :: prev))(tail)
        }
      case None =>
        // End of stream: flush all remaining partial groups.
        val chunk = Chunk.vector {
          state
            .toVector
            .collect { case (key, last :: revInit) =>
              val group = Nel.ofInitLast(revInit.reverse, last)
              key -> group
            }
        }
        Pull.output(chunk) >> Pull.done
    }

    go(Map.empty)(_).stream
  }
}

Usage:

import cats.data.{NonEmptyList => Nel}
import cats.implicits._
import cats.effect.{ExitCode, IO, IOApp}
import fs2._

import GroupingByKey.groupByKey

object Answer extends IOApp {
  type Key = String

  override def run(args: List[String]): IO[ExitCode] = {
    // Sanity checks: full groups are emitted as soon as they reach the limit,
    // and any leftover partial group is flushed at end of stream.
    require {
      Stream('a -> 1).through(groupByKey(2)).compile.toList ==
        List('a -> Nel.one(1))
    }

    require {
      Stream('a -> 1, 'a -> 2).through(groupByKey(2)).compile.toList ==
        List('a -> Nel.of(1, 2))
    }

    require {
      Stream('a -> 1, 'a -> 2, 'a -> 3).through(groupByKey(2)).compile.toList ==
        List('a -> Nel.of(1, 2), 'a -> Nel.one(3))
    }

    // An infinite stream of random (key, value) pairs.
    val infinite = (for {
      prng <- Stream.eval(IO { new scala.util.Random() })
      keys <- Stream(Vector[Key]("a", "b", "c", "d", "e", "f", "g"))
      key = Stream.eval(IO {
        val i = prng.nextInt(keys.size)
        keys(i)
      })
      num = Stream.eval(IO { 1 + prng.nextInt(9) })
    } yield (key zip num).repeat).flatten

    infinite
      .through(groupByKey(3))
      .showLinesStdOut
      .compile
      .drain
      .as(ExitCode.Success)
  }
}
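
For completeness, the pipe can also be run over the question's shuffled measurements. This is only a sketch (the contents of each group depend on the shuffle, and the leftover partial groups are flushed in Map iteration order at end of stream), assuming the GroupingByKey object above is in scope:

import fs2.Stream
import GroupingByKey.groupByKey

// The question's data, shuffled.
val measurements = scala.util.Random.shuffle(
  for {
    id    <- Seq("foo", "bar", "baz")
    value <- 1 to 5
  } yield (id, value)
)

// Each key should produce one full group of 3 followed by a leftover group of 2,
// e.g. (bar,NonEmptyList(4, 2, 1)) ... (bar,NonEmptyList(5, 3)).
Stream
  .emits(measurements)
  .through(groupByKey(3))
  .compile
  .toList
  .foreach(println)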