How do I dynamically add files to a zip archive stored in Azure blob storage?

Asked: 2018-06-04 12:11:30

Tags: c# azure zip azure-storage sharpziplib

I have a process in Azure that generates a large number of PDF report files and stores them in blob storage. Rather than sending users links to all of those files individually, I generate a zip file and send the user a single link to it.

This is all done in a single process and has worked fine until recently. Lately I have been getting OutOfMemory exceptions while adding files to the zip archive, and I am struggling to find a solution.

Below is the code I use to create the zip file (note: it uses the SharpZipLib library). Currently it fails with an OutOfMemoryException after adding roughly 45 files of about 3.5 MB each (PDFs). The failure occurs when execution hits the line zipStream.PutNextEntry(newEntry).

Does anyone know how to improve this process? A zip file at this size seems far too small to be failing.

Using outputMemStream As New MemoryStream()

    Using zipStream As New ICSharpCode.SharpZipLib.Zip.ZipOutputStream(outputMemStream)
        zipStream.SetLevel(7)

        Dim collD3 As UserSurveyReportCollection = GetFileList(RequestID)

        For Each entityD2 As UserSurveyReport In collD3

            Try
                Dim strF As String = entityD2.FileLocation

                'Download the blob as a MemoryStream and add that stream to the zip file
                Dim msR As MemoryStream = objA.DownloadBlobAsMemoryStream(azureAccount, ReportFolder, entityD2.FileName)
                msR.Seek(0, SeekOrigin.Begin)

                'Determine the file name used for this item inside the zip archive
                Dim strZipFileName As String = DetermineZipSourceName(entityD2, strFolder, strFileName)

                'Add the MemoryStream to the zip stream
                Dim newEntry As New ICSharpCode.SharpZipLib.Zip.ZipEntry(strZipFileName)
                newEntry.DateTime = DateTime.Now

                zipStream.PutNextEntry(newEntry)   'OutOfMemoryException is thrown here
                msR.CopyTo(zipStream)
                zipStream.CloseEntry()

                msR.Dispose()
                zipStream.Flush()

                intCounter += 1

            Catch exZip As Exception
                'Note: swallowing the exception here hides failures for individual files
            End Try

        Next

        zipStream.IsStreamOwner = False
        zipStream.Finish()
        zipStream.Close()

        outputMemStream.Position = 0

        Dim bytes As Byte() = outputMemStream.ToArray()
        result.Comment = objA.UploadBlob(bytes, azureAccount, ReportFolder, entityReport.FileName).AbsolutePath

    End Using
End Using

2 Answers:

Answer 0 (score: 1)

I found a solution. This approach minimizes the memory used while building the zip file and loads the resulting archive into blob storage in Azure. It uses the native System.IO.Compression library rather than a third-party zip library.

I created a class named ZipModel that simply holds a file name and a blob. I build a list of these and pass it to the function below. I hope this helps someone else in the same predicament.
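The ZipModel class itself is not shown in the answer; based on the members used in the function below (FolderName, FileName, ZipBlob), a minimal sketch might look like this (the property names are inferred from their usage, not confirmed by the author):

```vbnet
Imports Microsoft.WindowsAzure.Storage.Blob

'Minimal sketch of the ZipModel class referenced below.
'Property names are inferred from their usage in SendBlobsToZipFile.
Public Class ZipModel
    Public Property FolderName As String
    Public Property FileName As String
    Public Property ZipBlob As CloudBlockBlob
End Class
```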

    Private Function SendBlobsToZipFile(ByVal destinationBlob As CloudBlockBlob, ByVal sourceBlobs As List(Of ZipModel)) As Boolean

        Dim result As Boolean = True
        Dim resultCounter As Integer = 0

        'Write the zip archive directly to the destination blob so the
        'complete archive is never held in memory
        Using blobWriteStream As Stream = destinationBlob.OpenWrite()

            Using archive As New ZipArchive(blobWriteStream, ZipArchiveMode.Create)

                For Each zipM As ZipModel In sourceBlobs
                    Try
                        Dim strName As String = String.Format("{0}\{1}", zipM.FolderName, zipM.FileName)
                        Dim archiveEntry As ZipArchiveEntry = archive.CreateEntry(strName, CompressionLevel.Optimal)

                        'Stream each source blob straight into its archive entry
                        Using archiveWriteStream As Stream = archiveEntry.Open()
                            zipM.ZipBlob.DownloadToStream(archiveWriteStream)
                            resultCounter += 1
                        End Using
                    Catch ex As Exception
                        result = False
                    End Try
                Next

            End Using
        End Using

        Return result

    End Function
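A hypothetical call site might look like the following. The container reference and blob names here are illustrative assumptions, not part of the original answer:

```vbnet
'Illustrative usage only: "container", the blob names, and the report
'enumeration are assumptions for the sketch
Dim destinationBlob As CloudBlockBlob = container.GetBlockBlobReference("reports.zip")

Dim sourceBlobs As New List(Of ZipModel)
For Each entity As UserSurveyReport In GetFileList(RequestID)
    sourceBlobs.Add(New ZipModel With {
        .FolderName = ReportFolder,
        .FileName = entity.FileName,
        .ZipBlob = container.GetBlockBlobReference(entity.FileName)
    })
Next

Dim succeeded = SendBlobsToZipFile(destinationBlob, sourceBlobs)
```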

Answer 1 (score: 0)

For anyone working in C# who wants to write a large zip file to blob storage:

// Stream the archive directly to blob storage so the full zip is never held in memory
var blob = container.GetBlockBlobReference(outputFilename);
using (var stream = await blob.OpenWriteAsync())
using (var zip = new ZipArchive(stream, ZipArchiveMode.Create))
{
    for (int i = 0; i < 2000; i++)
    {
        // CreateRandomStream is the author's helper that produces a stream of test data
        using (var randomStream = CreateRandomStream(2))
        {
            var entry = zip.CreateEntry($"{i}.zip", CompressionLevel.Optimal);
            using (var innerFile = entry.Open())
            {
                await randomStream.CopyToAsync(innerFile);
            }
        }
    }
}

This works surprisingly well. While streaming to Azure, application memory sits at around 20 MB and CPU usage is very low. I have created very large output files (over 4.5 GB) with no problems.