How can I improve the performance of downloading a large Azure blob file via a stream?

Time: 2020-03-26 19:30:20

Tags: c# .net azure .net-core azure-storage-blobs

I have a JSON blob file of around 212 MB.
When debugging locally, it takes about 15 minutes to download.
When I deploy the code to an Azure App Service, it runs for 10 minutes and then fails with the following error (locally it also fails intermittently with the same error):

Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature

Code attempt 1:

// Create SAS token for referencing a file for a duration of 15 min
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
    Permissions = SharedAccessBlobPermissions.Read
};

var blob = cloudBlobContainer.GetBlockBlobReference(blobFilePath);
string sasContainerToken = blob.GetSharedAccessSignature(sasConstraints);

var cloudBlockBlob = new CloudBlockBlob(new Uri(blob.Uri + sasContainerToken));

using (var stream = new MemoryStream())
{
    await cloudBlockBlob.DownloadToStreamAsync(stream);

    // reset the stream's position to 0 before reading
    stream.Position = 0;
    var serializer = new JsonSerializer();

    using (var sr = new StreamReader(stream))
    {
        using (var jsonTextReader = new JsonTextReader(sr))
        {
            jsonTextReader.SupportMultipleContent = true;
            result = new List<T>();
            while (jsonTextReader.Read())
            {
                result.Add(serializer.Deserialize<T>(jsonTextReader));
            }
        }
    }
}
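Buffering the entire 212 MB into a MemoryStream before parsing doubles the memory cost and delays deserialization until the whole download has finished. A minimal sketch of an alternative, assuming the same cloudBlockBlob, Newtonsoft.Json serializer, and placeholder type T as in the attempt above, that streams the blob body straight into the JSON reader instead:

```csharp
// Sketch: parse the JSON while it downloads instead of buffering it all first.
// 4 MB is an illustrative read-ahead size, not a tuned value.
cloudBlockBlob.StreamMinimumReadSizeInBytes = 4 * 1024 * 1024;

using (var blobStream = await cloudBlockBlob.OpenReadAsync())
using (var sr = new StreamReader(blobStream))
using (var jsonTextReader = new JsonTextReader(sr))
{
    jsonTextReader.SupportMultipleContent = true;
    var serializer = new JsonSerializer();
    var result = new List<T>();
    while (await jsonTextReader.ReadAsync())
    {
        result.Add(serializer.Deserialize<T>(jsonTextReader));
    }
}
```

This also keeps each SAS-authenticated range request short, which may matter given the intermittent authentication error.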

Code attempt 2: I tried downloading the blob in chunks with DownloadRangeToStreamAsync, but it made no difference:

int bufferLength = 1 * 1024 * 1024;//1 MB chunk
long blobRemainingLength = blob.Properties.Length;
Queue<KeyValuePair<long, long>> queues = new Queue<KeyValuePair<long, long>>();
long offset = 0;
do
{
    long chunkLength = Math.Min((long)bufferLength, blobRemainingLength);
    long chunkOffset = offset; // range start for this chunk

    offset += chunkLength;
    blobRemainingLength -= chunkLength;
    using (var ms = new MemoryStream())
    {
        // download the range starting at chunkOffset, not the already-advanced offset
        await blob.DownloadRangeToStreamAsync(ms, chunkOffset, chunkLength);
        ms.Position = 0;
        lock (outPutStream)
        {
            outPutStream.Position = chunkOffset;
            var bytes = ms.ToArray();
            outPutStream.Write(bytes, 0, bytes.Length);
        }
    }
}
while (blobRemainingLength > 0);
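The loop in attempt 2 still downloads one range at a time, so the chunking gains nothing over a single DownloadToStreamAsync call. One way to get actual parallelism with the same SDK is to issue several range downloads concurrently; a sketch, where the chunk size and degree of parallelism are illustrative values, not tuned figures:

```csharp
// Sketch: download blob ranges concurrently and stitch them at their offsets.
int chunkSize = 4 * 1024 * 1024; // 4 MB per range
long blobLength = blob.Properties.Length;
var outPutStream = new MemoryStream(new byte[blobLength]); // pre-sized buffer

var tasks = new List<Task>();
using (var throttle = new SemaphoreSlim(8)) // at most 8 ranges in flight
{
    for (long offset = 0; offset < blobLength; offset += chunkSize)
    {
        long rangeStart = offset;
        long rangeLength = Math.Min(chunkSize, blobLength - rangeStart);
        await throttle.WaitAsync();
        tasks.Add(Task.Run(async () =>
        {
            try
            {
                using (var ms = new MemoryStream())
                {
                    await blob.DownloadRangeToStreamAsync(ms, rangeStart, rangeLength);
                    var bytes = ms.ToArray();
                    lock (outPutStream) // serialize writes to the shared stream
                    {
                        outPutStream.Position = rangeStart;
                        outPutStream.Write(bytes, 0, bytes.Length);
                    }
                }
            }
            finally
            {
                throttle.Release();
            }
        }));
    }
    await Task.WhenAll(tasks);
}
```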

I don't think 212 MB of data is a very large JSON file. Can you suggest a solution?

1 answer:

Answer 0: (score: 3)

I suggest you try the Azure Storage Data Movement Library.

I tested with a larger file of 220 MB; downloading it into memory took about 5 minutes.

Sample code:

        SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy
        {
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
            Permissions = SharedAccessBlobPermissions.Read
        };

        CloudBlockBlob blob = blobContainer.GetBlockBlobReference("t100.txt");
        string sasContainerToken = blob.GetSharedAccessSignature(sasConstraints);
        var cloudBlockBlob = new CloudBlockBlob(new Uri(blob.Uri + sasContainerToken));

        var stream = new MemoryStream();

        //set this value as per your need
        TransferManager.Configurations.ParallelOperations = 5;

        Console.WriteLine("begin to download...");

        //use Stopwatch to calculate the time
        Stopwatch stopwatch = new Stopwatch();
        stopwatch.Start();

        DownloadOptions options = new DownloadOptions();
        options.DisableContentMD5Validation = true;

        //use these lines of code just for checking the download progress; you can remove them in your code
        SingleTransferContext context = new SingleTransferContext();
        context.ProgressHandler = new Progress<TransferStatus>((progress) =>
        {
            Console.WriteLine("Bytes downloaded: {0}", progress.BytesTransferred);
        });

        var task = TransferManager.DownloadAsync(cloudBlockBlob, stream, options, context);
        task.Wait();

        stopwatch.Stop();
        Console.WriteLine("the length of the stream is: " + stream.Length);
        Console.WriteLine("the time taken in ms: " + stopwatch.ElapsedMilliseconds);
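If upgrading the SDK is an option, the current Azure.Storage.Blobs (v12) client has parallel downloads built in via StorageTransferOptions, so the separate Data Movement library may not be needed. A sketch, where the connection string, container name, and blob name are placeholders:

```csharp
// Sketch: parallel download with the v12 SDK instead of the Data Movement library.
var blobClient = new BlobClient(connectionString, "my-container", "t100.txt");

var transferOptions = new StorageTransferOptions
{
    MaximumConcurrency = 5,                 // parallel range requests
    InitialTransferSize = 4 * 1024 * 1024,  // illustrative sizes, not tuned
    MaximumTransferSize = 4 * 1024 * 1024
};

using (var stream = new MemoryStream())
{
    await blobClient.DownloadToAsync(stream, conditions: null, transferOptions: transferOptions);
    Console.WriteLine("downloaded bytes: " + stream.Length);
}
```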

Test result:

(screenshot omitted)