Request/response pattern with TPL Dataflow

Asked: 2018-05-01 16:36:16

Tags: c# .net task-parallel-library tpl-dataflow

We have run into a problem where we need a request/response pattern while using the TPL Dataflow library. We have a .NET Core API that calls a dependent service, and that dependent service limits concurrent requests. Our API does not limit concurrent requests, so we can receive thousands of requests at a time; once the dependent service reaches its limit, it starts rejecting requests. To deal with this we implemented a BufferBlock and a TransformBlock. Performance was solid and it worked well: we load-tested the front end of our API with 1000 users issuing 100 requests/sec and saw zero problems. The BufferBlock buffers the requests and the TransformBlock executes them in parallel up to the degree we want. The dependent service receives our requests and responds, we return that response from the TransformBlock delegate, and everything is fine. Our problem is that the BufferBlock and the TransformBlock are disconnected, meaning requests and responses are not kept in sync: a request can end up receiving another requester's response (see the code below).

Specific to the code below, the problem lies in the GetContent method. That method is called from the service layer in our API, which in turn is called from our controller; both the code below and the service layer are singletons. The SendAsync on the buffer is disconnected from the ReceiveAsync on the TransformBlock, so an arbitrary response is returned, not necessarily the one belonging to the request that was sent.

So our question is: is there a way, using Dataflow blocks, to correlate each request with its response? The end goal is that a request comes into our API, is sent to the dependent service, and its response is returned to the client. The code for our Dataflow implementation is below. Thanks.

public class HttpClientWrapper : IHttpClientManager
{
    private readonly IConfiguration _configuration;
    private readonly ITokenService _tokenService;
    private HttpClient _client;

    private BufferBlock<string> _bufferBlock;
    private TransformBlock<string, JObject> _actionBlock;

    public HttpClientWrapper(IConfiguration configuration, ITokenService tokenService)
    {
        _configuration = configuration;
        _tokenService = tokenService;

        _bufferBlock = new BufferBlock<string>();

        var executionDataFlowBlockOptions = new ExecutionDataflowBlockOptions
        {
            MaxDegreeOfParallelism = 10
        };

        var dataFlowLinkOptions = new DataflowLinkOptions
        {
            PropagateCompletion = true
        };

        _actionBlock = new TransformBlock<string, JObject>(t => ProcessRequest(t), executionDataFlowBlockOptions);

        _bufferBlock.LinkTo(_actionBlock, dataFlowLinkOptions);
    }

    public void Connect()
    {
        _client = new HttpClient();

        _client.DefaultRequestHeaders.Add("x-ms-client-application-name", "ourappname");
    }

    public async Task<JObject> GetContent(string request)
    {
        await _bufferBlock.SendAsync(request);

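        // NOTE: this ReceiveAsync is not tied to the SendAsync above; it can hand back
        // a response produced for a different caller's request (the problem described above).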
        var result = await _actionBlock.ReceiveAsync();

        return result;
    }

    private async Task<JObject> ProcessRequest(string request)
    {
        if (_client == null)
        {
            Connect();
        }

        try
        {
            var accessToken = await _tokenService.GetTokenAsync(_configuration);

            var httpRequestMessage = new HttpRequestMessage(HttpMethod.Post, new Uri($"https://{_configuration.Uri}"));

            // add the headers
            httpRequestMessage.Headers.Add("Authorization", $"Bearer {accessToken}");
            // add the request body
            httpRequestMessage.Content = new StringContent(request, Encoding.UTF8, "application/json");

            var postRequest = await _client.SendAsync(httpRequestMessage);

            var response = await postRequest.Content.ReadAsStringAsync();

            return JsonConvert.DeserializeObject<JObject>(response);
        }
        catch(Exception ex)
        {
            // log error

            return new JObject();
        }            
    }
}

3 Answers:

Answer 0 (score: 1)

What you need to do is tag each incoming item with an id so that you can correlate the data going in with the result coming out. Here is an example of how to do that:

namespace ConcurrentFlows.DataflowJobs {
    using System;
    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Threading.Tasks;
    using System.Threading.Tasks.Dataflow;

    /// <summary>
    /// A generic interface defining that:
    /// for a specified input type => an awaitable result is produced.
    /// </summary>
    /// <typeparam name="TInput">The type of data to process.</typeparam>
    /// <typeparam name="TOutput">The type of data the consumer expects back.</typeparam>
    public interface IJobManager<TInput, TOutput> {
        Task<TOutput> SubmitRequest(TInput data);
    }

    /// <summary>
    /// A TPL-Dataflow based job manager.
    /// </summary>
    /// <typeparam name="TInput">The type of data to process.</typeparam>
    /// <typeparam name="TOutput">The type of data the consumer expects back.</typeparam>
    public class DataflowJobManager<TInput, TOutput> : IJobManager<TInput, TOutput> {

        /// <summary>
        /// It is anticipated that jobHandler is an injected
        /// singleton instance of a Dataflow based 'calculator', though this implementation
        /// does not depend on it being a singleton.
        /// </summary>
        /// <param name="jobHandler">A singleton Dataflow block through which all jobs are processed.</param>
        public DataflowJobManager(IPropagatorBlock<KeyValuePair<Guid, TInput>, KeyValuePair<Guid, TOutput>> jobHandler) {
            if (jobHandler == null) { throw new ArgumentException("Argument cannot be null.", "jobHandler"); }

            this.JobHandler = jobHandler;
            if (!alreadyLinked) {
                JobHandler.LinkTo(ResultHandler, new DataflowLinkOptions() { PropagateCompletion = true });
                alreadyLinked = true;
            }
        }

        private static bool alreadyLinked = false;            

        /// <summary>
        /// Submits the request to the JobHandler and asynchronously awaits the result.
        /// </summary>
        /// <param name="data">The input data to be processed.</param>
        /// <returns></returns>
        public async Task<TOutput> SubmitRequest(TInput data) {
            var taggedData = TagInputData(data);
            var job = CreateJob(taggedData);
            Jobs.TryAdd(job.Key, job.Value);
            await JobHandler.SendAsync(taggedData);
            return await job.Value.Task;
        }

        private static ConcurrentDictionary<Guid, TaskCompletionSource<TOutput>> Jobs {
            get;
        } = new ConcurrentDictionary<Guid, TaskCompletionSource<TOutput>>();

        private static ExecutionDataflowBlockOptions Options {
            get;
        } = GetResultHandlerOptions();

        private static ITargetBlock<KeyValuePair<Guid, TOutput>> ResultHandler {
            get;
        } = CreateReplyHandler(Options);

        private IPropagatorBlock<KeyValuePair<Guid, TInput>, KeyValuePair<Guid, TOutput>> JobHandler {
            get;
        }

        private KeyValuePair<Guid, TInput> TagInputData(TInput data) {
            var id = Guid.NewGuid();
            return new KeyValuePair<Guid, TInput>(id, data);
        }

        private KeyValuePair<Guid, TaskCompletionSource<TOutput>> CreateJob(KeyValuePair<Guid, TInput> taggedData) {
            var id = taggedData.Key;
            var jobCompletionSource = new TaskCompletionSource<TOutput>();
            return new KeyValuePair<Guid, TaskCompletionSource<TOutput>>(id, jobCompletionSource);
        }

        private static ExecutionDataflowBlockOptions GetResultHandlerOptions() {
            return new ExecutionDataflowBlockOptions() {
                MaxDegreeOfParallelism = Environment.ProcessorCount,
                BoundedCapacity = 1000
            };
        }

        private static ITargetBlock<KeyValuePair<Guid, TOutput>> CreateReplyHandler(ExecutionDataflowBlockOptions options) {
            return new ActionBlock<KeyValuePair<Guid, TOutput>>((result) => {
                ReceiveOutput(result);
            }, options);
        }

        private static void ReceiveOutput(KeyValuePair<Guid, TOutput> result) {
            var jobId = result.Key;
            TaskCompletionSource<TOutput> jobCompletionSource;
            if (!Jobs.TryRemove(jobId, out jobCompletionSource)) {
                throw new InvalidOperationException($"The jobId: {jobId} was not found.");
            }
            var resultValue = result.Value;
            jobCompletionSource.SetResult(resultValue);            
        }
    }
}

See also this answer for reference.
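
For context, here is a minimal sketch (not part of the original answer) of how the manager above could be wired to the question's scenario. The ProcessRequest call and the requestJson variable are assumptions borrowed from the question's code, purely for illustration:

// Hypothetical wiring: a TransformBlock that carries the correlation Guid through,
// wrapping something like the question's ProcessRequest(string) => Task<JObject>.
var jobHandler = new TransformBlock<KeyValuePair<Guid, string>, KeyValuePair<Guid, JObject>>(
    async pair => new KeyValuePair<Guid, JObject>(pair.Key, await ProcessRequest(pair.Value)),
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 10 });

var jobManager = new DataflowJobManager<string, JObject>(jobHandler);

// Each caller awaits only its own result, because the Guid ties request and response together.
JObject result = await jobManager.SubmitRequest(requestJson);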

Answer 1 (score: 1)

Simple throttling is not a particularly compelling use case for the TPL Dataflow library; using a SemaphoreSlim seems simpler and more attractive. However, if you need more functionality, such as enforcing a minimum duration for each request, or having a way to await the completion of all pending requests, then TPL Dataflow offers things that SemaphoreSlim does not. The basic idea is to avoid passing bare input values to the block and then trying to correlate them with the produced results afterwards. It is much safer to create a task for each request up front, send those tasks to an ActionBlock<Task>, and let the block activate and await them asynchronously with the specified MaxDegreeOfParallelism. That way an input value and its result are permanently and explicitly tied together.

public class ThrottledExecution<T>
{
    private readonly ActionBlock<Task<Task<T>>> _actionBlock;
    private readonly CancellationToken _cancellationToken;

    public ThrottledExecution(int concurrencyLevel, int minDurationMilliseconds = 0,
        CancellationToken cancellationToken = default)
    {
        if (minDurationMilliseconds < 0) throw new ArgumentOutOfRangeException();
        _actionBlock = new ActionBlock<Task<Task<T>>>(async task =>
        {
            try
            {
                var delay = Task.Delay(minDurationMilliseconds, cancellationToken);
                task.RunSynchronously();
                await task.Unwrap().ConfigureAwait(false);
                await delay.ConfigureAwait(false);
            }
            catch { } // Ignore exceptions (errors are propagated through the task)
        }, new ExecutionDataflowBlockOptions()
        {
            MaxDegreeOfParallelism = concurrencyLevel,
            CancellationToken = cancellationToken,
        });
        _cancellationToken = cancellationToken;
    }

    public Task<T> Run(Func<Task<T>> function)
    {
        // Create a cold task (the function will be invoked later)
        var task = new Task<Task<T>>(function, _cancellationToken);
        var accepted = _actionBlock.Post(task);
        _cancellationToken.ThrowIfCancellationRequested();
        if (!accepted) throw new InvalidOperationException(
            "The component has been marked as complete.");
        return task.Unwrap();
    }

    public void Complete() => _actionBlock.Complete();
    public Task Completion => _actionBlock.Completion;
}

Usage example:

private ThrottledExecution<JObject> throttledExecution
    = new ThrottledExecution<JObject>(concurrencyLevel: 10);

public Task<JObject> GetContent(string request)
{
    return throttledExecution.Run(() => ProcessRequest(request));
}
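
If the minimum-duration feature mentioned above were also needed, the same class could be constructed with the optional second parameter shown in its signature; the values below are illustrative only:

private ThrottledExecution<JObject> throttledExecution
    = new ThrottledExecution<JObject>(concurrencyLevel: 10, minDurationMilliseconds: 100);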

Answer 2 (score: 0)

I appreciate the answer provided by JSteward; it is a perfectly acceptable approach. However, I ended up solving this with SemaphoreSlim. SemaphoreSlim provides two things that make it a robust solution. First, it has a constructor overload that takes a count; this count is the number of concurrent callers that can get past the semaphore's wait mechanism. The wait mechanism is provided by a method called WaitAsync. With the approach below, where the Worker class is a singleton, concurrent requests come in, the outgoing HTTP calls are limited to 10 at a time, and each response is returned to the correct request. The implementation looks like this:

public class Worker: IWorker
{
    private readonly IHttpClientManager _httpClient;
    private readonly ITokenService _tokenService;

    private readonly SemaphoreSlim _semaphore;

    public Worker(IHttpClientManager httpClient, ITokenService tokenService)
    {
        _httpClient = httpClient;
        _tokenService = tokenService;

        // we want to limit the number of items here
        _semaphore = new SemaphoreSlim(10);
    }

    public async Task<JObject> ProcessRequestAsync(string request, string route)
    {
        try
        {
            var accessToken = await _tokenService.GetTokenAsync(
                _timeSeriesConfiguration.TenantId,
                _timeSeriesConfiguration.ClientId,
                _timeSeriesConfiguration.ClientSecret);

            var cancellationToken = new CancellationTokenSource();

            cancellationToken.CancelAfter(30000);

            await _semaphore.WaitAsync(cancellationToken.Token);
            var httpResponseMessage = await _httpClient.SendAsync(new HttpClientRequest
            {
                Method = HttpMethod.Post,
                Uri = $"https://someuri/someroute",
                Token = accessToken,
                Content = request
            });

            var response = await httpResponseMessage.Content.ReadAsStringAsync();

            return JsonConvert.DeserializeObject<JObject>(response);
        }
        catch (Exception ex)
        {
            // do some logging

            throw;
        }
        finally
        {
            _semaphore.Release();
        }
    }
}
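
As a closing note, here is a minimal sketch of the singleton registration this relies on, assuming ASP.NET Core's built-in container (the interface and class names mirror the code in the question and in this answer; this registration is not part of the original post):

// One Worker instance, and therefore one SemaphoreSlim, is shared across all requests,
// so at most 10 outgoing HTTP calls run concurrently.
services.AddSingleton<IHttpClientManager, HttpClientWrapper>();
services.AddSingleton<IWorker, Worker>();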