I have successfully listed the available files, but I need to know how to pass one of those files to the browser for the user to download, without saving it to the server.
Here is how I get the list of files:
var azureConnectionString = CloudConfigurationManager.GetSetting("AzureBackupStorageConnectString");
var containerName = ConfigurationManager.AppSettings["FmAzureBackupStorageContainer"];
if (azureConnectionString == null || containerName == null)
return null;
CloudStorageAccount backupStorageAccount = CloudStorageAccount.Parse(azureConnectionString);
var backupBlobClient = backupStorageAccount.CreateCloudBlobClient();
var container = backupBlobClient.GetContainerReference(containerName);
var blobs = container.ListBlobs(useFlatBlobListing: true);
var downloads = blobs.Select(blob => blob.Uri.Segments.Last()).ToList();
Answer 0 (score: 45)
While blob content can be streamed through the web server and on to the end user's browser, this solution puts load on the web server, both CPU and NIC.
An alternative is to give the end user the URI of the blob they want to download, which they can click in your HTML content, e.g. https://myaccount.blob.core.windows.net/mycontainer/myblob.ext.
The problem with this is private content: a plain URI like the one above only works for public blobs. For private content you can create a Shared Access Signature (or a server-stored access policy), which produces a hashed query string appended to the URI. The resulting URI is only valid for a given length of time (10 minutes, for example).
Here is a small example of creating a SAS for a blob:
var sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5);
sasConstraints.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10);
sasConstraints.Permissions = SharedAccessBlobPermissions.Read;
var sasBlobToken = blob.GetSharedAccessSignature(sasConstraints);
return blob.Uri + sasBlobToken;
Note that the start time is set a few minutes in the past; this is to deal with clock drift. Here is the full tutorial I took/modified this code sample from.
By using direct blob access you completely bypass your VM / web role instance / website instance (reducing server load) and let the end user pull blob content straight from blob storage. You can still use your web app to handle permissions, decide which content to hand out, and so on, but this lets you link directly to blob resources rather than streaming them through your web server.
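To tie this back to the question, here is a minimal sketch (my own, not from the original answer) of an MVC action that validates the request in your app and then redirects the browser to a short-lived SAS URI; the action name and blobName parameter are assumptions, while the setting names come from the question's code:
// Hypothetical controller action: the download goes straight from blob storage
// to the browser, not through the web server.
public ActionResult Download(string blobName)
{
    var account = CloudStorageAccount.Parse(
        CloudConfigurationManager.GetSetting("AzureBackupStorageConnectString"));
    var container = account.CreateCloudBlobClient()
        .GetContainerReference(ConfigurationManager.AppSettings["FmAzureBackupStorageContainer"]);
    var blob = container.GetBlockBlobReference(blobName);

    var sasConstraints = new SharedAccessBlobPolicy
    {
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),  // allow for clock drift
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10), // link valid for ~10 minutes
        Permissions = SharedAccessBlobPermissions.Read
    };

    // Send the browser directly to the signed blob URI.
    return Redirect(blob.Uri + blob.GetSharedAccessSignature(sasConstraints));
}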
Answer 1 (score: 9)
When the user clicks a file, the server responds with this:
var blob = container.GetBlobReferenceFromServer(option);
var memStream = new MemoryStream();
blob.DownloadToStream(memStream);
Response.ContentType = blob.Properties.ContentType;
Response.AddHeader("Content-Disposition", "Attachment;filename=" + option);
Response.AddHeader("Content-Length", blob.Properties.Length.ToString());
Response.BinaryWrite(memStream.ToArray());
Many thanks to Dhananjay Kumar for this solution.
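As a variation (my own sketch, not part of the original answer), the blob can be written straight to the response output stream, which avoids buffering the whole file in a MemoryStream on the server:
Response.ContentType = blob.Properties.ContentType;
Response.AddHeader("Content-Disposition", "Attachment;filename=" + option);
Response.AddHeader("Content-Length", blob.Properties.Length.ToString());
blob.DownloadToStream(Response.OutputStream); // stream directly, no intermediate buffer
Response.Flush();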
Answer 2 (score: 2)
If you are using ASP.NET (Core), a more elegant solution is to stream the content to the browser without saving the file on the server, using a FileStreamResult (an IActionResult).
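A minimal sketch of that approach (my own, based on this hint; the controller, action, and configuration keys are assumptions for illustration):
// Hypothetical ASP.NET Core controller: open a read stream on the blob and let
// FileStreamResult copy it to the response; nothing is written to disk on the server.
public class DownloadController : Controller
{
    private readonly IConfiguration _config;
    public DownloadController(IConfiguration config) { _config = config; }

    public async Task<IActionResult> Get(string blobName)
    {
        var account = CloudStorageAccount.Parse(_config["AzureBackupStorageConnectString"]);
        var container = account.CreateCloudBlobClient()
            .GetContainerReference(_config["FmAzureBackupStorageContainer"]);
        var blob = container.GetBlockBlobReference(blobName);

        var stream = await blob.OpenReadAsync();                    // stream straight from blob storage
        return File(stream, "application/octet-stream", blobName); // returns a FileStreamResult
    }
}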
Answer 3 (score: 0)
I have put together a sample in which you can upload and download blob files.
using System;
using System.Threading.Tasks;
using System.IO;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System.Linq;
using System.Collections.Generic;

namespace GetBackup
{
    class Program
    {
        static async Task Main(string[] args)
        {
            // Read the configuration file into a JObject.
            string Config_string = "";
            using (StreamReader SourceReader = File.OpenText(@"appsettings.json"))
            {
                Config_string = await SourceReader.ReadToEndAsync();
            }
            var config = (JObject)JsonConvert.DeserializeObject(Config_string);

            if (config["Application_type"].ToString() == "Backup")
            {
                // Backup: walk the local directory and upload every file to the container.
                string Dir_path = config["Backup_Path"].ToString();
                string[] allfiles = Directory.GetFiles(Dir_path, "*.*", SearchOption.AllDirectories);
                string storageConnectionString = config["AZURE_STORAGE_CONNECTION_STRING"].ToString();
                CloudStorageAccount storageAccount;
                if (CloudStorageAccount.TryParse(storageConnectionString, out storageAccount))
                {
                    CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
                    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("rtddata");
                    //await cloudBlobContainer.CreateAsync();

                    string[] ExcludeFiles = config["Exception_File"].ToString().Split(',');
                    foreach (var file in allfiles)
                    {
                        FileInfo info = new FileInfo(file);
                        if (!ExcludeFiles.Contains(info.Name))
                        {
                            // Preserve the sub-folder structure as part of the blob name.
                            string folder = (Dir_path.Length < info.DirectoryName.Length) ? info.DirectoryName.Replace(Dir_path, "") : "";
                            folder = (folder.Length > 0) ? folder + "/" : "";
                            CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(folder + info.Name);
                            await cloudBlockBlob.UploadFromFileAsync(info.FullName);
                        }
                    }
                }
            }
            else if (config["Application_type"].ToString() == "Restore")
            {
                // Restore: list every blob in the container and download it to the restore directory.
                string storageConnectionString = config["AZURE_STORAGE_CONNECTION_STRING"].ToString();
                CloudStorageAccount storageAccount;
                if (CloudStorageAccount.TryParse(storageConnectionString, out storageAccount))
                {
                    CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
                    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("rtddata");
                    string Dir_path = config["Restore_Path"].ToString();

                    IEnumerable<IListBlobItem> results = cloudBlobContainer.ListBlobs(null, true);
                    foreach (IListBlobItem item in results)
                    {
                        string name = ((CloudBlockBlob)item).Name;

                        // Recreate the first level of sub-folders locally before downloading.
                        if (name.Contains('/'))
                        {
                            string[] subfolder = name.Split('/');
                            if (!Directory.Exists(Dir_path + subfolder[0]))
                            {
                                Directory.CreateDirectory(Dir_path + subfolder[0]);
                            }
                        }

                        CloudBlockBlob blockBlob = cloudBlobContainer.GetBlockBlobReference(name);
                        string path = (Dir_path + name);
                        blockBlob.DownloadToFile(path, FileMode.Create);
                    }
                }
            }
        }
    }
}
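For reference, the sample reads its settings from appsettings.json. A hypothetical file matching the keys used above (all values are placeholders; note the paths keep their trailing separators because the code concatenates them directly with blob names) could look like:
{
  "Application_type": "Backup",
  "Backup_Path": "C:\\Data\\ToBackup\\",
  "Restore_Path": "C:\\Data\\Restored\\",
  "Exception_File": "thumbs.db,desktop.ini",
  "AZURE_STORAGE_CONNECTION_STRING": "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
}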