Suppose I have two observables.
The first observable is a list of items:
[
  {id: 'zzz', other props here...},
  {id: 'aaa', ...},
  {id: '007', ...}
  ... and more over time
]
The second observable is a list of ignored items:
[
  {id: '007'}, // only id, no other props
  {id: 'zzz'}
  ... and more over time
]
The result should be a new observable list (based on the first observable), but it must not contain any of the ignored items:
[
  {id: 'aaa', other props here...}
  ... and more over time
]
Here is what I had before posting:
obs2.pipe(withLatestFrom(obs1, ? => ?, filter(?));
Answer 0 (score: 1)
I haven't tested it, but I think this should work:
combineLatest(values$, excluded$).pipe(
  map(([values, excluded]) => {
    // put all the excluded IDs into a map for better perfs
    const excludedIds: Map<string, undefined> = excluded.reduce(
      (acc: Map<string, undefined>, item) => {
        acc.set(item.id, undefined);
        return acc;
      },
      new Map()
    );
    // filter the array, by looking up if the current
    // item.id is in the excluded list or not
    return values.filter(item => !excludedIds.has(item.id));
  })
)
Explanation:
By using combineLatest, you'll always be notified no matter which source the update comes from. If you used withLatestFrom as in your example, an update would only be triggered when the values$ observable updates; if excluded$ changed, it wouldn't trigger an update in your case.
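For a quick illustration, here is a minimal sketch using hypothetical Subject-backed stand-ins for values$ and excluded$ (these names and the sample ids are assumptions, not code from the answer); only the combineLatest pipeline reacts when excluded$ emits on its own:

import { Subject, combineLatest } from "rxjs";
import { withLatestFrom, map } from "rxjs/operators";

// Hypothetical stand-ins for the real streams
const values$ = new Subject<{ id: string }[]>();
const excluded$ = new Subject<{ id: string }[]>();

// Only emits when values$ emits
values$.pipe(
  withLatestFrom(excluded$),
  map(([values, excluded]) => values.filter(v => !excluded.some(e => e.id === v.id)))
).subscribe(v => console.log("withLatestFrom:", v));

// Emits whenever either source emits (once both have emitted at least once)
combineLatest(values$, excluded$).pipe(
  map(([values, excluded]) => values.filter(v => !excluded.some(e => e.id === v.id)))
).subscribe(v => console.log("combineLatest:", v));

excluded$.next([{ id: "zzz" }]);                // primes excluded$, nothing logs yet
values$.next([{ id: "aaa" }, { id: "zzz" }]);   // both pipelines log [{ id: 'aaa' }]
excluded$.next([{ id: "zzz" }, { id: "aaa" }]); // only combineLatest logs again ([])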
Then all the excluded IDs are put into a Map instead of an array, because all we need to know is whether a given ID should be excluded, and looking up a key in a Map is much faster than searching through an array.
Then simply filter the values array.
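Putting the pieces together, a rough usage sketch follows; it assumes values$ and excluded$ emit the question's sample lists, and it uses a Set, which serves the same purpose here as the Map above:

import { combineLatest, of } from "rxjs";
import { map } from "rxjs/operators";

// Hypothetical stand-ins for the real streams, using the ids from the question
const values$ = of([{ id: "zzz" }, { id: "aaa" }, { id: "007" }]);
const excluded$ = of([{ id: "007" }, { id: "zzz" }]);

combineLatest(values$, excluded$).pipe(
  map(([values, excluded]) => {
    // collect the excluded ids once, then filter with O(1) lookups
    const excludedIds = new Set(excluded.map(item => item.id));
    return values.filter(item => !excludedIds.has(item.id));
  })
).subscribe(result => console.log(result)); // logs [{ id: 'aaa' }]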
Answer 1 (score: 1)
If I understand correctly, what you're trying to do is accumulate the items emitted by the first stream over time and drop any whose id shows up in the second (ignored) stream.
Given that, below is a rough example you could try. As pointed out at the bottom, you'll get different results depending on the timing of the first two streams, because that's just how async works. To demonstrate this, I'm simulating random delays on the items emitted over time.
Hope this helps!
P.S.: The code below is TypeScript and assumes rxjs@^6.
import { BehaviorSubject, combineLatest, of, Observable } from "rxjs";
import { delay, map, scan, concatMap } from "rxjs/operators";
/**
 * Data sources
 */

// Just for showcase purposes... Simulates items emitted over time
const simulatedEmitOverTime = <T>() => (source: Observable<T>) =>
  source.pipe(
    concatMap(thing => of(thing).pipe(delay(Math.random() * 1000)))
  );
interface Thing {
  id: string;
}
// Stream of things over time
const thingsOverTime$ = of(
{ id: "zzz" },
{ id: "aaa" },
{ id: "007" }
).pipe(
simulatedEmitOverTime()
);
// Stream of ignored things over time
const ignoredThingsOverTime$ = of(
{ id: "007" },
{ id: "zzz" }
).pipe(
simulatedEmitOverTime()
);
/**
 * Somewhere in your app
 */
// Aggregate incoming things
// `scan` takes a reducer-type function
const aggregatedThings$ = thingsOverTime$.pipe(
  scan(
    (aggregatedThings: Thing[], incomingThing: Thing) =>
      aggregatedThings.concat(incomingThing),
    []
  )
);
// Create a Set from incoming ignored thing ids
// A Set will allow for easy filtering over time
const ignoredIds$ = ignoredThingsOverTime$.pipe(
  scan(
    (excludedIdSet, incomingThing: Thing) =>
      excludedIdSet.add(incomingThing.id),
    new Set<string>()
  )
);
// Combine stream and then filter out ignored ids
const sanitizedThings$ = combineLatest(aggregatedThings$, ignoredIds$)
  .pipe(
    map(([things, ignored]) => things.filter(({ id }) => !ignored.has(id)))
);
// Subscribe where needed
// Note: End result will vary depending on the timing of items coming in
// over time (which is being simulated here-ish)
sanitizedThings$.subscribe(console.log);
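If you're on RxJS 6.5 or later, combineLatest also accepts an array of sources (the rest-argument form is deprecated in RxJS 7), so the combination step can be written equivalently as follows; this reuses the aggregatedThings$ and ignoredIds$ streams and the imports defined above, with sanitizedThingsAlt$ as an arbitrary new name:

// Same combination as sanitizedThings$, using the array signature (RxJS >= 6.5)
const sanitizedThingsAlt$ = combineLatest([aggregatedThings$, ignoredIds$]).pipe(
  map(([things, ignored]) => things.filter(({ id }) => !ignored.has(id)))
);

sanitizedThingsAlt$.subscribe(console.log);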