Batching work with Reactor's Flux.buffer only ever processes a single item

Asked: 2019-03-15 17:43:29

Tags: java kotlin project-reactor reactive-streams

I'm trying to use Flux.buffer() to batch up loads from a database.

The use case is that loading records from the database can be "bursty", and I'd like to introduce a small buffer so that loads can be grouped together where possible.

My conceptual approach is to use some form of processor, publish into its sink, let it buffer, and then subscribe to it and filter for the results I want.

I've tried multiple different approaches (different kinds of processor, creating the filtered Mono in different ways).

Below is as far as I've got, mostly by stumbling into obstacles.

Currently, this returns a single result, but subsequent calls are dropped (though I'm unsure where).

class BatchLoadingRepository {
    // I've tried all manner of different processors here.  I'm unsure if
    // TopicProcessor is the correct one to use.
    private val bufferPublisher = TopicProcessor.create<String>()
    private val resultsStream = bufferPublisher
            .bufferTimeout(50, Duration.ofMillis(50))
            // I'm unsure if concatMapIterable is the correct operator here, 
            // but it seems to work.
            // I'm really trying to turn the List<MyEntity> 
            // into a stream of MyEntity, published on the Flux<>
            .concatMapIterable { requestedIds ->
                // this is a Spring Data repository.  It returns List<MyEntity>
                repository.findAllById(requestedIds)
            }

    // Multiple callers will invoke this method, and then subscribe to receive
    // their entity back.
    fun findByIdAsync(id: String): Mono<MyEntity> {

        // Is there a potential race condition here, caused by a result
        // on the resultsStream, before I've subscribed?
        return Mono.create<MyEntity> { sink ->
            bufferPublisher.sink().next(id)
            resultsStream.filter { it.id == id }
                    .subscribe { next ->
                        sink.success(next)
                    }
        }
    }
}
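For comparison, the same buffer-then-dispatch idea can be sketched with plain JDK types, with no Reactor involved. This is a hypothetical illustration (the `BatchingLoader` class and its fake lookup function are not from the question): callers register a future for their id, ids accumulate for a short window, then one batched lookup resolves them all.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Function;

public class BatchingLoader {
    private final Function<List<String>, Map<String, String>> batchLookup;
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    private final Map<String, CompletableFuture<String>> pending = new HashMap<>();
    private boolean flushScheduled = false;

    public BatchingLoader(Function<List<String>, Map<String, String>> batchLookup) {
        this.batchLookup = batchLookup;
    }

    // Callers get a future that completes when the next batch is flushed.
    public synchronized CompletableFuture<String> findByIdAsync(String id) {
        CompletableFuture<String> future =
                pending.computeIfAbsent(id, k -> new CompletableFuture<>());
        if (!flushScheduled) {
            flushScheduled = true;
            // Flush after a small window, analogous to bufferTimeout's duration.
            scheduler.schedule(this::flush, 50, TimeUnit.MILLISECONDS);
        }
        return future;
    }

    private void flush() {
        Map<String, CompletableFuture<String>> batch;
        synchronized (this) {
            batch = new HashMap<>(pending);
            pending.clear();
            flushScheduled = false;
        }
        // One call resolves the whole batch of ids.
        Map<String, String> results = batchLookup.apply(new ArrayList<>(batch.keySet()));
        batch.forEach((id, future) -> future.complete(results.get(id)));
    }

    public void shutdown() {
        scheduler.shutdown();
    }

    public static void main(String[] args) {
        // Fake "repository": maps each id to a value in a single batched call.
        BatchingLoader loader = new BatchingLoader(ids -> {
            Map<String, String> found = new HashMap<>();
            for (String id : ids) found.put(id, "entity-" + id);
            return found;
        });
        CompletableFuture<String> a = loader.findByIdAsync("1");
        CompletableFuture<String> b = loader.findByIdAsync("2");
        System.out.println(a.join()); // entity-1
        System.out.println(b.join()); // entity-2
        loader.shutdown();
    }
}
```

This avoids the shared-stream filtering in the question entirely: each caller holds its own future, so there is nothing to race against and no per-caller subscription to leak.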

1 answer:

Answer 0 (score: 0)

Hi, I was testing your code, and I think the best approach is to use a shared EmitterProcessor. I tested it with EmitterProcessor and it seems to work fine:

Flux<String> fluxi;
EmitterProcessor<String> emitterProcessor;

@Override
public void run(String... args) throws Exception {
    emitterProcessor = EmitterProcessor.create();

    fluxi = emitterProcessor.share()
            .bufferTimeout(500, Duration.ofMillis(500))
            .concatMapIterable(o -> o);

    Flux.range(0,1000)
            .flatMap(integer -> findByIdAsync(integer.toString()))
            .map(s -> {
                System.out.println(s);
                return s;
            }).subscribe();

}

private Mono<String> findByIdAsync(String id) {
    return Mono.create(monoSink -> {
        fluxi.filter(s -> s.equals(id)).subscribe(value -> monoSink.success(value));
        emitterProcessor.onNext(id);
    });
}
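One thing to watch in the filter: in Java, `==` on Strings compares references, not values. A version written as `s == id` only happens to work here because the exact same String instance passed to onNext flows back through the Flux; as soon as the id is reconstructed elsewhere (parsed from a request, read from a database), the reference check fails, so `s.equals(id)` is the safe comparison. A quick illustration:

```java
public class StringCompare {
    public static void main(String[] args) {
        String a = "42";
        String b = Integer.toString(42); // a distinct String object with the same value
        System.out.println(a == b);      // false: compares references
        System.out.println(a.equals(b)); // true: compares values
    }
}
```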