ReceiveAsync interrupts / breaks message delivery

Date: 2018-09-07 09:05:56

Tags: c# .net tpl-dataflow

This issue came to light while trying to implement the suggested solution to this problem.

Problem summary

Executing a ReceiveAsync() call from a TransformBlock to a WriteOnceBlock causes the TransformBlock to essentially remove itself from the flow. It stops propagating messages of any kind, whether data messages or completion signals.

System design

The system is meant to parse large CSV files through a series of steps.

The problematic part of the flow can be (inexpertly) visualized as follows:

(Image: Partial data flow)

The parallelogram is a BufferBlock, the diamonds are BroadcastBlocks, the triangles are WriteOnceBlocks, and the arrows are TransformBlocks. Solid lines represent links created with LinkTo(), and the dashed line represents the ReceiveAsync() call from the ParsedHeaderAndRecordJoiner to the ParsedHeaderContainer block. I'm aware this flow is somewhat suboptimal, but that is not the main reason for the question.
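Since the dashed line is a pull-style ReceiveAsync() call rather than a LinkTo() link, here is a minimal standalone sketch (not the question's code) of that interaction: once a WriteOnceBlock holds its single value, every later ReceiveAsync() call completes with a copy of it.

using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

class WriteOnceReceiveDemo
{
    static async Task Main()
    {
        // A WriteOnceBlock stores at most one value and hands a copy to each receiver.
        var container = new WriteOnceBlock<string>(s => (string)s.Clone());
        container.Post("parsed header");

        // Every ReceiveAsync call completes with (a copy of) that same single value.
        Console.WriteLine(await container.ReceiveAsync()); // "parsed header"
        Console.WriteLine(await container.ReceiveAsync()); // "parsed header" again
    }
}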

Code

Application root

This is part of the class that creates the necessary blocks and links them together using PropagateCompletion:

using (var cancellationSource = new CancellationTokenSource())
{
    var cancellationToken = cancellationSource.Token;
    var temporaryEntityInstance = new Card(); // Just as an example

    var producerQueue = queueFactory.CreateQueue<string>(new DataflowBlockOptions{CancellationToken = cancellationToken});
    var recordDistributor = distributorFactory.CreateDistributor<string>(s => (string)s.Clone(), 
        new DataflowBlockOptions { CancellationToken = cancellationToken });
    var headerRowContainer = containerFactory.CreateContainer<string>(s => (string)s.Clone(), 
        new DataflowBlockOptions { CancellationToken = cancellationToken });
    var headerRowParser = new HeaderRowParserFactory().CreateHeaderRowParser(temporaryEntityInstance.GetType(), ';', 
        new ExecutionDataflowBlockOptions { CancellationToken = cancellationToken });
    var parsedHeaderContainer = containerFactory.CreateContainer<HeaderParsingResult>(HeaderParsingResult.Clone, 
        new DataflowBlockOptions { CancellationToken = cancellationToken});
    var parsedHeaderAndRecordJoiner = new ParsedHeaderAndRecordJoinerFactory().CreateParsedHeaderAndRecordJoiner(parsedHeaderContainer, 
        new ExecutionDataflowBlockOptions { CancellationToken = cancellationToken });
    var entityParser = new EntityParserFactory().CreateEntityParser(temporaryEntityInstance.GetType(), ';',
        dataflowBlockOptions: new ExecutionDataflowBlockOptions { CancellationToken = cancellationToken });
    var entityDistributor = distributorFactory.CreateDistributor<EntityParsingResult>(EntityParsingResult.Clone, 
        new DataflowBlockOptions{CancellationToken = cancellationToken});

    var linkOptions = new DataflowLinkOptions {PropagateCompletion = true};

    // Producer subprocess
    producerQueue.LinkTo(recordDistributor, linkOptions);

    // Header subprocess
    recordDistributor.LinkTo(headerRowContainer, linkOptions);
    headerRowContainer.LinkTo(headerRowParser, linkOptions);
    headerRowParser.LinkTo(parsedHeaderContainer, linkOptions);
    parsedHeaderContainer.LinkTo(errorQueue, new DataflowLinkOptions{MaxMessages = 1, PropagateCompletion = true}, dataflowResult => !dataflowResult.WasSuccessful);

    // Parsing subprocess
    recordDistributor.LinkTo(parsedHeaderAndRecordJoiner, linkOptions);
    parsedHeaderAndRecordJoiner.LinkTo(entityParser, linkOptions, joiningResult => joiningResult.WasSuccessful);
    entityParser.LinkTo(entityDistributor, linkOptions);
    entityDistributor.LinkTo(errorQueue, linkOptions, dataflowResult => !dataflowResult.WasSuccessful);
}
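For context, a hypothetical driver sketch (not part of the code above; csvFilePath is a placeholder) showing how the CSV lines would be posted and the Complete signal sent to the front of the flow, inside the same using block:

// Hypothetical driver: post each CSV line, then complete the head of the flow so that
// PropagateCompletion can carry the signal downstream.
foreach (var line in File.ReadLines(csvFilePath))
{
    await producerQueue.SendAsync(line, cancellationToken);
}
producerQueue.Complete();

// Awaiting the tail blocks indicates when the whole pipeline has drained.
await Task.WhenAll(entityDistributor.Completion, errorQueue.Completion);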

HeaderRowParser

This block parses the header row of the CSV file and performs some validation.

public class HeaderRowParserFactory
{
    public TransformBlock<string, HeaderParsingResult> CreateHeaderRowParser(Type entityType,
        char delimiter,
        ExecutionDataflowBlockOptions dataflowBlockOptions = null)
    {
        return new TransformBlock<string, HeaderParsingResult>(headerRow =>
        {
            // Set up some containers
            var result = new HeaderParsingResult(identifier: "N/A", wasSuccessful: true);
            var fieldIndexesByPropertyName = new Dictionary<string, int>();

            // Get all serializable properties on the chosen entity type
            var serializableProperties = entityType.GetProperties()
                .Where(prop => prop.IsDefined(typeof(CsvFieldNameAttribute), false))
                .ToList();

            // Add their CSV fieldnames to the result
            var entityFieldNames = serializableProperties.Select(prop => prop.GetCustomAttribute<CsvFieldNameAttribute>().FieldName);
            result.SetEntityFieldNames(entityFieldNames);

            // Create the dictionary of properties by field name
            var serializablePropertiesByFieldName = serializableProperties.ToDictionary(prop => prop.GetCustomAttribute<CsvFieldNameAttribute>().FieldName, prop => prop, StringComparer.OrdinalIgnoreCase);

            var fields = headerRow.Split(delimiter);

            for (var i = 0; i < fields.Length; i++)
            {
                // If any field in the CSV is unknown as a serializable property, we return a failed result
                if (!serializablePropertiesByFieldName.TryGetValue(fields[i], out var foundProperty))
                {
                    result.Invalidate($"The header row contains a field that does not match any of the serializable properties - {fields[i]}.",
                        DataflowErrorSeverity.Critical);
                    return result;
                }

                // Perform a bunch more validation

                fieldIndexesByPropertyName.Add(foundProperty.Name, i);
            }

            result.SetFieldIndexesByName(fieldIndexesByPropertyName);
            return result;
        }, dataflowBlockOptions ?? new ExecutionDataflowBlockOptions());
    }
}
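The CsvFieldNameAttribute referenced above is not shown in the question; presumably it is a simple property attribute along these lines (a sketch):

// Assumed shape of the attribute used to map entity properties to CSV column names.
[AttributeUsage(AttributeTargets.Property)]
public class CsvFieldNameAttribute : Attribute
{
    public string FieldName { get; }

    public CsvFieldNameAttribute(string fieldName)
    {
        FieldName = fieldName;
    }
}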

ParsedHeaderAndRecordJoiner

For every subsequent record passing through the pipeline, this block is meant to retrieve the parsed header data and attach it to the record.

public class ParsedHeaderAndRecordJoinerFactory
{
    public TransformBlock<string, HeaderAndRecordJoiningResult> CreateParsedHeaderAndRecordJoiner(WriteOnceBlock<HeaderParsingResult> parsedHeaderContainer, 
        ExecutionDataflowBlockOptions dataflowBlockOptions = null)
    {
        return new TransformBlock<string, HeaderAndRecordJoiningResult>(async csvRecord =>
            {
                var headerParsingResult = await parsedHeaderContainer.ReceiveAsync();

                // If the header couldn't be parsed, a critical error is already on its way to the failure logger so we don't need to continue
                if (!headerParsingResult.WasSuccessful) return new HeaderAndRecordJoiningResult(identifier: "N.A.", wasSuccessful: false, null, null);

                // The entity parser can't do anything with the header record, so we send a message with wasSuccessful false
                var isHeaderRecord = true;
                foreach (var entityFieldName in headerParsingResult.EntityFieldNames)
                {
                    isHeaderRecord &= csvRecord.Contains(entityFieldName);
                }
                if (isHeaderRecord) return new HeaderAndRecordJoiningResult(identifier: "N.A.", wasSuccessful: false, null, null);

                return new HeaderAndRecordJoiningResult(identifier: "N.A.", wasSuccessful: true, headerParsingResult, csvRecord);
            }, dataflowBlockOptions ?? new ExecutionDataflowBlockOptions());
    }
}

Problem details

With the current implementation, the ParsedHeaderAndRecordJoiner correctly receives data from the ReceiveAsync() call to the ParsedHeaderContainer and returns as expected, yet no messages arrive at the EntityParser.

Furthermore, when the Complete signal is sent to the front of the flow (the ProducerQueue), it propagates to the RecordDistributor but then stops at the ParsedHeaderAndRecordJoiner (it does continue onward from the HeaderRowContainer, so the RecordDistributor keeps passing the signal along).

If I remove the ReceiveAsync() call and mock the data myself, the block behaves exactly as expected.
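For illustration, replacing the ReceiveAsync() call with locally mocked data looks roughly like this (a simplified sketch, not the exact mock used):

return new TransformBlock<string, HeaderAndRecordJoiningResult>(csvRecord =>
{
    // Instead of awaiting the WriteOnceBlock, fabricate a successful header result locally.
    var headerParsingResult = new HeaderParsingResult(identifier: "N/A", wasSuccessful: true);

    return new HeaderAndRecordJoiningResult(identifier: "N.A.", wasSuccessful: true, headerParsingResult, csvRecord);
}, dataflowBlockOptions ?? new ExecutionDataflowBlockOptions());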

1 Answer:

Answer 0 (score: 1)

I think this is the key:

    "yet no messages arrive at the EntityParser."

Based on the sample, the only way the EntityParser doesn't receive a message output by the ParsedHeaderAndRecordJoiner is when WasSuccessful returns false. The predicate used in that link filters out the failed messages, but those messages have nowhere else to go, so they pile up in the ParsedHeaderAndRecordJoiner's output buffer and also prevent Completion from propagating. You need to link a null target to dump the failed messages.

parsedHeaderAndRecordJoiner.LinkTo(DataflowBlock.NullTarget<HeaderAndRecordJoiningResult>());
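Because a source offers each message to its links in the order the links were created, the unfiltered null target can simply be added after the existing filtered link; successful results still go to the EntityParser and only the declined, failed results get dumped. A sketch using the variable names from the question:

// The filtered link to entityParser comes first, so successful results still flow there;
// the unfiltered NullTarget added last swallows the failed results that would otherwise
// pile up in the joiner's output buffer and block completion.
parsedHeaderAndRecordJoiner.LinkTo(entityParser, linkOptions, joiningResult => joiningResult.WasSuccessful);
parsedHeaderAndRecordJoiner.LinkTo(DataflowBlock.NullTarget<HeaderAndRecordJoiningResult>());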

If your mocked data always returned WasSuccessful as true, that may have been what pointed you towards the await ...ReceiveAsync() call.

Not necessarily a smoking gun, but a good place to start. When the pipeline gets stuck, can you confirm the state of all the messages sitting in the ParsedHeaderAndRecordJoiner's output buffer?
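One way to check (a sketch, assuming the block instance is reachable from your diagnostic code): a TransformBlock exposes an OutputCount property reporting how many produced messages are still waiting in its output buffer.

// Diagnostic sketch: inspect the joiner while the pipeline is stuck.
Console.WriteLine($"Joiner output buffer: {parsedHeaderAndRecordJoiner.OutputCount} message(s)");
Console.WriteLine($"Joiner completion status: {parsedHeaderAndRecordJoiner.Completion.Status}");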