How do I clean up the files and directories in Azure Local Storage after a file has started streaming to the browser?

Asked: 2013-07-24 20:22:34

Tags: azure asynchronous local-storage filestream httpresponse

Background: I'm using Azure Local Storage, which should be treated as "volatile" storage. First, how long do the files and directories I create persist on the web role instances (in my case there are two)? Do I need to worry about running out of storage space if these files/directories are not cleaned up after each user is done? What I'm doing is pulling multiple files from a separate service, storing them in Azure Local Storage, compressing them into a single zip file (also stored there), and finally streaming that zip file to the browser.

Problem: All of this works beautifully, except for one small hiccup. The file appears to stream to the browser asynchronously, so when I try to delete the zipped file from Azure Local Storage an exception is thrown, because the file is still in the middle of streaming to the browser. What is the best way to force the delete to happen only after the file has been completely streamed to the browser?

Here is my code:

            using (Service.Company.ServiceProvider CONNECT = new eZ.Service.CompanyConnect.ServiceProvider())
            {
                // Iterate through all of the files chosen
                foreach (Uri fileId in fileIds)
                {
                    // Get the int file id value from the uri
                    // Note: a character class should be [Bb]; [B|b] also matches a literal '|'
                    System.Text.RegularExpressions.Regex rex = new System.Text.RegularExpressions.Regex(@"e[Bb]://[^\/]*/\d*/(\d*)");
                    string id_str = rex.Match(fileId.ToString()).Groups[1].Value;
                    int id = int.Parse(id_str);

                    // Get the file object from eB service from the file id passed in
                    eZ.Data.File f = new eZ.Data.File(CONNECT.eZSession, id);
                    f.Retrieve("Header; Repositories");

                    string _fileName = f.Name;

                    try
                    {
                        using (MemoryStream stream = new MemoryStream())
                        {
                            f.ContentData = new eZ.ContentData.File(f, stream);

                            // After the ContentData is created, hook into the event
                            f.ContentData.TransferProgressed += (sender, e) => { Console.WriteLine(e.Percentage); };

                            // Now do the transfer, the event will fire as blocks of data is read
                            int bytesRead;
                            f.ContentData.OpenRead();
                            // Open the Azure Local Storage file stream
                            using (azure_file_stream = File.OpenWrite(curr_user_path + _fileName))
                            {
                                while ((bytesRead = f.ContentData.Read()) > 0)
                                {
                                    // Write the chunk to azure local storage
                                    byte[] buffer = stream.GetBuffer();
                                    azure_file_stream.Write(buffer, 0, bytesRead);
                                    stream.Position = 0;
                                }
                            }
                        }
                    }
                    catch (Exception e)
                    {
                        //Console.WriteLine("The following error occurred:  " + e);
                        throw;  // rethrow; "throw e" would reset the stack trace
                    }
                    finally
                    {
                        if (f.ContentData != null)
                            f.ContentData.Close();
                    }
                } // end of foreach block

            } // end of eB using block

            string sevenZipDllPath = Path.Combine(Utilities.GetCurrentAssemblyPath(), "7z.dll");
            Global.logger.Info(string.Format("sevenZipDllPath: {0}", sevenZipDllPath));
            SevenZipCompressor.SetLibraryPath(sevenZipDllPath);

            var compressor = new SevenZipCompressor
            {
                ArchiveFormat = OutArchiveFormat.Zip,
                CompressionLevel = CompressionLevel.Fast
            };

            // Compress the user directory
            compressor.CompressDirectory(webRoleAzureStorage.RootPath + curr_user_directory, curr_user_package_path + "Package.zip");

            // stream Package.zip to the browser
            httpResponse.BufferOutput = false;
            httpResponse.ContentType = Utilities.GetMIMEType("BigStuff3.mp4");
            httpResponse.AppendHeader("content-disposition", "attachment; filename=Package.zip");

            azure_file_stream = File.OpenRead(curr_user_package_path + "Package.zip");
            azure_file_stream.CopyTo(httpResponse.OutputStream);
            httpResponse.End();

            // Azure Local Storage cleanup
            foreach (FileInfo file in user_directory.GetFiles())
            {
                file.Delete();
            }
            foreach (FileInfo file in package_directory.GetFiles())
            {
                file.Delete();
            }
            user_directory.Delete();
            package_directory.Delete();
        }
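Two things conspire against the cleanup block above: the read stream opened on Package.zip is never disposed, and `HttpResponse.End` raises a `ThreadAbortException`, so the code after it likely never runs at all. One way to guarantee the delete happens only after the copy has finished is to dispose the read stream first and put the delete in a `finally` block. The sketch below is a hypothetical helper (`StreamThenDelete` is not part of the original code); in the handler above it would be called with `httpResponse.OutputStream` as the destination, before `httpResponse.End()`:

```csharp
using System;
using System.IO;

static class DownloadCleanup
{
    // Streams the file into the destination, then deletes it once the copy
    // has fully completed. Stream.CopyTo blocks until every byte is written,
    // and the FileStream is disposed before the delete, so no open handle
    // can block File.Delete.
    public static void StreamThenDelete(string path, Stream destination)
    {
        try
        {
            using (FileStream fs = File.OpenRead(path))
            {
                fs.CopyTo(destination);
            }
        }
        finally
        {
            if (File.Exists(path))
                File.Delete(path);
        }
    }
}
```

Because the delete sits in `finally`, the temporary zip is removed even if the client aborts the download midway.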

2 Answers:

Answer 0 (score: 1)

Could you simply run a job on the machine that cleans up files a day after they were created? It could be as simple as a batch file in Task Scheduler, or a separate thread started from WebRole.cs. You could even use AzureWatch to automatically re-image your instance if local space falls below a certain threshold.
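The scheduled-cleanup idea boils down to a small helper that a timer or background thread (started from `WebRole.OnStart`, for example) calls periodically. This is a sketch under the assumption that anything older than the cutoff is safe to remove; `LocalStorageJanitor` and `DeleteOlderThan` are illustrative names, not an existing API:

```csharp
using System;
using System.IO;
using System.Linq;

static class LocalStorageJanitor
{
    // Deletes files in the given directory (recursively) whose last write
    // time is older than maxAge, and returns how many were removed.
    public static int DeleteOlderThan(string directory, TimeSpan maxAge)
    {
        DateTime cutoff = DateTime.UtcNow - maxAge;
        var stale = new DirectoryInfo(directory)
            .EnumerateFiles("*", SearchOption.AllDirectories)
            .Where(f => f.LastWriteTimeUtc < cutoff)
            .ToList();
        foreach (FileInfo f in stale)
            f.Delete();
        return stale.Count;
    }
}
```

Calling this every few minutes with a one-day `maxAge` keeps local storage bounded even when a per-request delete is missed.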

Answer 1 (score: 1)

Could you put the files (especially the final zipped file the user downloads) in Windows Azure blob storage? The file could be public, or you could create a shared access signature so that only people you give the URL to can download it. Putting the file in blob storage for download also takes some load off your web server.
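With the classic WindowsAzure.Storage client library (the SDK of that era), uploading the zip and handing out a time-limited shared access signature looks roughly like the sketch below. The container name, connection string, and `localZipPath` are placeholders, and the code will not run without real storage account credentials:

```csharp
using System;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

static class PackagePublisher
{
    // Uploads the zip to a blob container and returns a URL that stays
    // valid for one hour via a shared access signature (SAS).
    public static string UploadAndGetSasUrl(string connectionString, string localZipPath)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobContainer container = account.CreateCloudBlobClient()
                                              .GetContainerReference("packages");
        container.CreateIfNotExists();

        CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(localZipPath));
        using (FileStream fs = File.OpenRead(localZipPath))
            blob.UploadFromStream(fs);

        string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
        });
        return blob.Uri + sas;
    }
}
```

The web role then only redirects the browser to the returned URL; local storage holds the zip just long enough to upload it.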