I have a setup that uploads database backups to Azure Blob Storage from a Windows Service. It works fine for .bak files between 300 and 500 MB, but once the file grows to roughly 700 MB–1 GB or more, the upload runs for over an hour and then throws an exception.
Please review the code below and tell me what I am doing wrong, and what an efficient way to upload large files to Blob storage would be. I have tried both of the following methods.
public static void UploadFile(AzureOperationHelper azureOperationHelper)
{
    CloudBlobContainer blobContainer = CreateCloudBlobContainer(tenantId, applicationId,
        clientSecret, azureOperationHelper.storageAccountName, azureOperationHelper.containerName,
        azureOperationHelper.storageEndPoint);
    blobContainer.CreateIfNotExists();

    var writeOptions = new BlobRequestOptions()
    {
        // Files above this size are uploaded in blocks (64 MB maximum, 32 MB by default).
        SingleBlobUploadThresholdInBytes = 50 * 1024 * 1024,
        ParallelOperationThreadCount = 12,
    };

    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(azureOperationHelper.blobName);
    //blob.UploadFromFile(azureOperationHelper.srcPath);
    blob.UploadFromFile(azureOperationHelper.srcPath, options: writeOptions);
}
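Something I am still unsure about is whether the client-side timeouts on the same options object play a role here. A sketch of what setting them would look like (assuming the ServerTimeout and MaximumExecutionTime properties that BlobRequestOptions exposes):

var writeOptionsWithTimeouts = new BlobRequestOptions()
{
    SingleBlobUploadThresholdInBytes = 50 * 1024 * 1024,  // upload in blocks above ~50 MB
    ParallelOperationThreadCount = 12,                     // parallel block uploads
    ServerTimeout = TimeSpan.FromMinutes(10),              // timeout for each individual request
    MaximumExecutionTime = TimeSpan.FromHours(2)           // overall limit for the whole upload
};
blob.UploadFromFile(azureOperationHelper.srcPath, options: writeOptionsWithTimeouts);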
public static void UploadFileStream(AzureOperationHelper azureOperationHelper)
{
    CloudBlobContainer blobContainer = CreateCloudBlobContainer(tenantId, applicationId,
        clientSecret, azureOperationHelper.storageAccountName, azureOperationHelper.containerName,
        azureOperationHelper.storageEndPoint);
    blobContainer.CreateIfNotExists();

    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(azureOperationHelper.blobName);

    //byte[] contents = File.ReadAllBytes(azureOperationHelper.srcPath);
    //var writeOptions = new BlobRequestOptions()
    //{
    //    SingleBlobUploadThresholdInBytes = 50 * 1024 * 1024, // 64 MB maximum, 32 MB by default
    //    ParallelOperationThreadCount = 12,
    //};
    //blob.UploadFromByteArray(contents, 0, contents.Length, AccessCondition.GenerateIfNotExistsCondition(), options: writeOptions);

    blob.StreamWriteSizeInBytes = 100 * 1024 * 1024; // 100 MB block size
    blob.UploadFromFile(azureOperationHelper.srcPath);

    //using (var fs = new FileStream(azureOperationHelper.srcPath, FileMode.Open))
    //{
    //    blob.UploadFromStream(fs);
    //}
}
Below are the exceptions I get.

Microsoft.WindowsAzure.Storage.StorageException: The remote server returned an error: (403) Forbidden. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden.
   at Microsoft.WindowsAzure.Storage.Shared.Protocol.HttpResponseParsers.ProcessExpectedStatusCodeNoException[T](HttpStatusCode expectedStatusCode, HttpStatusCode actualStatusCode, T retVal, StorageCommandBase`1 cmd, Exception ex)

Microsoft.WindowsAzure.Storage.StorageException: The client could not finish the operation within the specified timeout. ---> System.TimeoutException: The client could not finish the operation within the specified timeout.
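One more thing I suspect: the 403 only appears after the upload has been running for close to an hour, which would match the Azure AD access token acquired in CreateCloudBlobContainer expiring mid-transfer (as far as I know those tokens are valid for roughly an hour). A sketch of a self-renewing credential I am considering, assuming the TokenCredential renewal overload in Microsoft.WindowsAzure.Storage.Auth; GetTokenAsync stands in for however the token is currently acquired:

// Renewal callback: called by the SDK shortly before the current token expires.
private static async Task<NewTokenAndFrequency> RenewTokenAsync(object state, CancellationToken cancellationToken)
{
    string newToken = await GetTokenAsync(tenantId, applicationId, clientSecret); // placeholder helper
    return new NewTokenAndFrequency(newToken, TimeSpan.FromMinutes(55));          // renew again in ~55 minutes
}

// Inside CreateCloudBlobContainer, build the account from the renewing credential:
// var tokenCredential = new TokenCredential(initialToken, RenewTokenAsync, null, TimeSpan.FromMinutes(55));
// var storageCredentials = new StorageCredentials(tokenCredential);
// var account = new CloudStorageAccount(storageCredentials, storageAccountName, storageEndPoint, true);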
Answer 0 (score: 0)
Please use the code below; it works well for me (a file of roughly 2 GB takes about 10 minutes to upload):
public string UploadFile(string sourceFilePath)
{
    try
    {
        string storageAccountConnectionString = "AZURE_CONNECTION_STRING";
        CloudStorageAccount StorageAccount = CloudStorageAccount.Parse(storageAccountConnectionString);
        CloudBlobClient BlobClient = StorageAccount.CreateCloudBlobClient();
        CloudBlobContainer Container = BlobClient.GetContainerReference("container-name");
        Container.CreateIfNotExists();

        CloudBlockBlob blob = Container.GetBlockBlobReference(Path.GetFileName(sourceFilePath));

        // Block IDs must be committed in the order the blocks were uploaded,
        // so keep them in a List rather than a HashSet.
        List<string> blocklist = new List<string>();
        byte[] fileContent = File.ReadAllBytes(sourceFilePath);
        const int pageSizeInBytes = 10485760; // 10 MB per block
        long prevLastByte = 0;
        long bytesRemain = fileContent.Length;
        do
        {
            long bytesToCopy = Math.Min(bytesRemain, pageSizeInBytes);
            byte[] bytesToSend = new byte[bytesToCopy];
            Array.Copy(fileContent, prevLastByte, bytesToSend, 0, bytesToCopy);
            prevLastByte += bytesToCopy;
            bytesRemain -= bytesToCopy;

            // Create a base64-encoded block ID and upload the block.
            string blockId = Guid.NewGuid().ToString();
            string base64BlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockId));
            blob.PutBlock(
                base64BlockId,
                new MemoryStream(bytesToSend, true),
                null
            );
            blocklist.Add(base64BlockId);
        } while (bytesRemain > 0);

        // Commit the uploaded blocks.
        blob.PutBlockList(blocklist);
        return "Success";
    }
    catch (Exception ex)
    {
        return ex.Message;
    }
}
It works very well for uploading large files (someone provided this solution here).
Please let me know whether it works for you.
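If holding the whole backup in memory becomes a problem (File.ReadAllBytes needs a byte array as large as the file), the same block-upload pattern can read the file in chunks instead. A rough sketch along the same lines, not tested against your setup (needs System.IO, System.Text and the same storage namespaces):

public string UploadFileStreamed(CloudBlockBlob blob, string sourceFilePath)
{
    const int blockSizeInBytes = 10 * 1024 * 1024;    // 10 MB per block
    List<string> blocklist = new List<string>();      // block IDs, committed in upload order
    byte[] buffer = new byte[blockSizeInBytes];
    using (FileStream fs = File.OpenRead(sourceFilePath))
    {
        int blockNumber = 0;
        int bytesRead;
        while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            // All block IDs of a blob must have the same length, hence the padded counter.
            string base64BlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));
            blob.PutBlock(base64BlockId, new MemoryStream(buffer, 0, bytesRead), null);
            blocklist.Add(base64BlockId);
            blockNumber++;
        }
    }
    blob.PutBlockList(blocklist); // commit the blocks
    return "Success";
}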