If possible, I would like to create an async enumerator for tasks launched in parallel, so that the first task to complete is the first element of the enumeration, the second to complete is the second element, and so on.

public static async IAsyncEnumerable<T> ParallelEnumerateAsync<T>(
    this IEnumerable<Task<T>> coldAsyncTasks)
{
    // ...
}

I am fairly sure there is a way to do this with ContinueWith and a Queue<T>, but I do not entirely trust myself to implement it.
Answer 0 (score: 1)
Is this what you are looking for?
public static async IAsyncEnumerable<T> ParallelEnumerateAsync<T>(
    this IEnumerable<Task<T>> tasks)
{
    var remaining = new List<Task<T>>(tasks);

    while (remaining.Count != 0)
    {
        var task = await Task.WhenAny(remaining);
        remaining.Remove(task);
        yield return await task;
    }
}
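A minimal way to consume it could look like the sketch below (my own, not part of the answer): three delays of different lengths come back shortest first. Note that new List<Task<T>>(tasks) materializes the lazy enumerable, which is what starts all the tasks before the WhenAny loop begins.

// Consumption sketch (illustrative, not from the answer).
var tasks = new[] { 300, 100, 200 }.Select(async ms =>
{
    await Task.Delay(ms);   // simulate work of varying length
    return ms;
});

await foreach (var ms in tasks.ParallelEnumerateAsync())
{
    Console.WriteLine(ms);  // expected order: 100, 200, 300
}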
Answer 1 (score: 1)
If I understand your question right, your focus is to launch all tasks, let them all run in parallel, but make sure the return values are processed in the same order the tasks were started.

Checking out the specs of C# 8.0 Asynchronous Streams, a task queue that executes in parallel but returns sequentially can look like this:
async Task RunAsyncStreams()
{
    await foreach (var n in RunAndPreserveOrderAsync(GenerateTasks(6)))
    {
        Console.WriteLine($"#{n} is returned");
    }
}

IEnumerable<Task<int>> GenerateTasks(int count)
{
    return Enumerable.Range(1, count).Select(async n =>
    {
        await Task.Delay(new Random().Next(100, 1000));
        Console.WriteLine($"#{n} is complete");
        return n;
    });
}

async IAsyncEnumerable<int> RunAndPreserveOrderAsync(IEnumerable<Task<int>> tasks)
{
    var queue = new Queue<Task<int>>(tasks);
    while (queue.Count > 0) yield return await queue.Dequeue();
}
Possible output:
#5 is complete
#1 is complete
#1 is returned
#3 is complete
#6 is complete
#2 is complete
#2 is returned
#3 is returned
#4 is complete
#4 is returned
#5 is returned
#6 is returned
In reality there does not seem to be any new language-level support for this pattern, and since async streams work with IAsyncEnumerable<T>, a plain Task will not do here: all of the worker async methods have to share the same Task<T> return type, which somewhat limits async-streams-based design.

Because of this, and depending on your situation (Do you want to be able to cancel long-running tasks? Is per-task exception handling required? Should there be a limit on the number of concurrent tasks?), it may make sense to check out @TheGeneral's suggestions.
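One way around the shared Task<T> requirement is to adapt result-less work by giving it a placeholder result of the common type. The sketch below is my own, not part of the answer; WrapAsync and its placeholder return value are purely illustrative.

// Adapts a result-less async operation to the Task<int>-only pipeline above.
static async Task<int> WrapAsync(Func<Task> work, int id)
{
    await work();   // the actual operation, which has no result of its own
    return id;      // placeholder value so the task fits IEnumerable<Task<int>>
}

// e.g. RunAndPreserveOrderAsync(Enumerable.Range(1, 3)
//          .Select(i => WrapAsync(() => Task.Delay(100 * i), i)))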
Answer 2 (score: 1)
My take on this task. Borrowing heavily from the other answers in this thread, but with (hopefully) a few enhancements. The idea is to start the tasks and put them in a queue, same as in the other answers, but like Theodor Zoulias I also want to limit the maximum degree of parallelism. However, I try to overcome the limitation he mentioned in his comment by using a task continuation to enqueue the next task as soon as any of the previous ones completes. This way we maximize the number of simultaneously running tasks, within the configured limit of course.

I am not an async expert, and this solution may well contain multithreading deadlocks and other Heisenbugs; I did not test exception handling etc., so you have been warned.
public static async IAsyncEnumerable<TResult> ExecuteParallelAsync<TResult>(
    IEnumerable<Task<TResult>> coldTasks, int degreeOfParallelism)
{
    if (degreeOfParallelism < 1)
        throw new ArgumentOutOfRangeException(nameof(degreeOfParallelism));

    if (coldTasks is ICollection<Task<TResult>>) throw new ArgumentException(
        "The enumerable should not be materialized.", nameof(coldTasks));

    var queue = new ConcurrentQueue<Task<TResult>>();

    using var enumerator = coldTasks.GetEnumerator();

    for (var index = 0; index < degreeOfParallelism && EnqueueNextTask(); index++) ;

    while (queue.TryDequeue(out var nextTask))
        yield return await nextTask;

    bool EnqueueNextTask()
    {
        lock (enumerator)
        {
            if (!enumerator.MoveNext()) return false;

            var nextTask = enumerator.Current
                .ContinueWith(t =>
                {
                    EnqueueNextTask();
                    return t.Result;
                });

            queue.Enqueue(nextTask);
            return true;
        }
    }
}
We use this method to generate test tasks (borrowed from DK's answer):
IEnumerable<Task<int>> GenerateTasks(int count)
{
    return Enumerable.Range(1, count).Select(async n =>
    {
        Console.WriteLine($"#{n} started");
        await Task.Delay(new Random().Next(100, 1000));
        Console.WriteLine($"#{n} is complete");
        return n;
    });
}
And his (or her) test runner:
async void Main()
{
    await foreach (var n in ExecuteParallelAsync(GenerateTasks(9), 3))
    {
        Console.WriteLine($"#{n} is returned");
    }
}
This is the result we get in LINQPad (which is awesome, by the way):
#1 started
#2 started
#3 started
#3 is complete
#4 started
#2 is complete
#5 started
#1 is complete
#6 started
#1 is returned
#2 is returned
#3 is returned
#4 is complete
#7 started
#4 is returned
#6 is complete
#8 started
#7 is complete
#9 started
#8 is complete
#5 is complete
#5 is returned
#6 is returned
#7 is returned
#8 is returned
#9 is complete
#9 is returned
Note how the next task is started as soon as any of the previous ones completes, and how the order in which they are returned is still preserved.
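Since the answer explicitly leaves exception handling untested, here is a small probe of my own (not part of the answer; ProbeFaultAsync is an illustrative name) suggesting that a fault in a cold task surfaces from the yield loop as an AggregateException, because the continuation above reads t.Result:

// Minimal probe of the ContinueWith(t => t.Result) pattern used above.
static async Task ProbeFaultAsync()
{
    Task<int> failing = Task.FromException<int>(new InvalidOperationException("boom"));
    Task<int> continued = failing.ContinueWith(t => t.Result);

    try
    {
        await continued;
    }
    catch (AggregateException ex)
    {
        // Reading t.Result rethrows the original error wrapped in an
        // AggregateException, and that wrapper is what the await observes.
        Console.WriteLine(ex.InnerException?.Message); // prints "boom"
    }
}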
Answer 3 (score: 1)
If you want to take an async stream (IAsyncEnumerable) and run a Select over it in parallel, so that whatever finishes first comes out first:
/// <summary>
/// Runs the selectors in parallel and yields in completion order
/// </summary>
public static async IAsyncEnumerable<TOut> SelectParallel<TIn, TOut>(
    this IAsyncEnumerable<TIn> source,
    Func<TIn, Task<TOut>> selector)
{
    if (source == null)
    {
        throw new InvalidOperationException("Source is null");
    }

    var enumerator = source.GetAsyncEnumerator();
    var sourceFinished = false;
    var tasks = new HashSet<Task<TOut>>();
    Task<bool> sourceMoveTask = null;
    Task<Task<TOut>> pipeCompletionTask = null;
    try
    {
        while (!sourceFinished || tasks.Any())
        {
            if (sourceMoveTask == null && !sourceFinished)
            {
                sourceMoveTask = enumerator.MoveNextAsync().AsTask();
            }

            if (pipeCompletionTask == null && tasks.Any())
            {
                pipeCompletionTask = Task.WhenAny<TOut>(tasks);
            }

            var coreTasks = new Task[] { pipeCompletionTask, sourceMoveTask }
                .Where(t => t != null)
                .ToList();

            if (!coreTasks.Any())
            {
                break;
            }

            await Task.WhenAny(coreTasks);

            if (sourceMoveTask != null && sourceMoveTask.IsCompleted)
            {
                sourceFinished = !sourceMoveTask.Result;
                if (!sourceFinished)
                {
                    try
                    {
                        tasks.Add(selector(enumerator.Current));
                    }
                    catch { }
                }

                sourceMoveTask = null;
            }

            if (pipeCompletionTask != null && pipeCompletionTask.IsCompleted)
            {
                var completedTask = pipeCompletionTask.Result;
                if (completedTask.IsCompletedSuccessfully)
                {
                    yield return completedTask.Result;
                }

                tasks.Remove(completedTask);
                pipeCompletionTask = null;
            }
        }
    }
    finally
    {
        await enumerator.DisposeAsync();
    }
}
It can be used as follows:
static readonly Random rnd = new Random(); // shared random source used by Map below

static async Task Main(string[] args)
{
    var source = GetIds();
    var strs = source.SelectParallel(Map);

    await foreach (var str in strs)
    {
        Console.WriteLine(str);
    }
}

static async IAsyncEnumerable<int> GetIds()
{
    foreach (var i in Enumerable.Range(1, 20))
    {
        await Task.Delay(200);
        yield return i;
    }
}

static async Task<string> Map(int id)
{
    await Task.Delay(rnd.Next(1000, 2000));
    return $"{id}_{Thread.CurrentThread.ManagedThreadId}";
}
Possible output:
[6:31:03 PM] 1_5
[6:31:03 PM] 2_6
[6:31:04 PM] 3_6
[6:31:04 PM] 6_4
[6:31:04 PM] 5_4
[6:31:04 PM] 4_5
[6:31:05 PM] 8_6
[6:31:05 PM] 7_6
[6:31:05 PM] 11_6
[6:31:05 PM] 10_4
[6:31:05 PM] 9_6
[6:31:06 PM] 14_6
[6:31:06 PM] 12_4
[6:31:06 PM] 13_4
[6:31:06 PM] 15_4
[6:31:07 PM] 17_4
[6:31:07 PM] 20_4
[6:31:07 PM] 16_6
[6:31:07 PM] 18_6
[6:31:08 PM] 19_6
Answer 4 (score: 0)
Here is a version that also allows specifying a maximum degree of parallelism. The idea is that the tasks are enumerated lazily, with a lag. With degreeOfParallelism: 4, for example, the first 4 tasks are enumerated immediately, which creates them, and then the first of them is awaited. Next the fifth task is enumerated and the second one is awaited, and so on.

To keep things tidy, the Lag method is embedded inside the ParallelEnumerateAsync method as a static local function (a new C# 8 feature).

Since the enumeration of the coldTasks enumerable will most likely be driven by multiple threads, it is enumerated through a thread-safe wrapper.
public static async IAsyncEnumerable<TResult> ParallelEnumerateAsync<TResult>(
    this IEnumerable<Task<TResult>> coldTasks, int degreeOfParallelism)
{
    if (degreeOfParallelism < 1)
        throw new ArgumentOutOfRangeException(nameof(degreeOfParallelism));

    if (coldTasks is ICollection<Task<TResult>>) throw new ArgumentException(
        "The enumerable should not be materialized.", nameof(coldTasks));

    foreach (var task in Safe(Lag(coldTasks, degreeOfParallelism - 1)))
    {
        yield return await task.ConfigureAwait(false);
    }

    static IEnumerable<T> Lag<T>(IEnumerable<T> source, int count)
    {
        var queue = new Queue<T>();
        using (var enumerator = source.GetEnumerator())
        {
            int index = 0;
            while (enumerator.MoveNext())
            {
                queue.Enqueue(enumerator.Current);
                index++;
                if (index > count) yield return queue.Dequeue();
            }
        }
        while (queue.Count > 0) yield return queue.Dequeue();
    }

    static IEnumerable<T> Safe<T>(IEnumerable<T> source)
    {
        var locker = new object();
        using (var enumerator = source.GetEnumerator())
        {
            while (true)
            {
                T item;
                lock (locker)
                {
                    if (!enumerator.MoveNext()) yield break;
                    item = enumerator.Current;
                }
                yield return item;
            }
        }
    }
}
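For completeness, a consumption sketch of my own (not part of the answer), reusing the GenerateTasks helper from the earlier answers; with this call at most three tasks should be in flight at any time:

// Consumption sketch (illustrative, not from the answer).
await foreach (var n in GenerateTasks(9).ParallelEnumerateAsync(degreeOfParallelism: 3))
{
    Console.WriteLine($"#{n} is returned");
}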