Parallel archive upload to Amazon Glacier

Asked: 2013-04-18 13:56:01

Tags: c# multithreading amazon-glacier

I have a sample application that uses the AWS SDK for .NET high-level API to upload a 26 MB file to Amazon Glacier. The code works fine without threads, but when run on the thread pool it fails on the following line:

         client.UploadMultipartPart(uploadMPUrequest);

with the error message: The request was aborted: The request was canceled.

Stack trace:

    at Amazon.Runtime.AmazonWebServiceClient.handleHttpWebErrorResponse(AsyncResult asyncResult, WebException we)
    at Amazon.Runtime.AmazonWebServiceClient.getRequestStreamCallback(IAsyncResult result)
    at Amazon.Runtime.AmazonWebServiceClient.InvokeConfiguredRequest(AsyncResult asyncResult)
    at Amazon.Runtime.AmazonWebServiceClient.InvokeHelper(AsyncResult asyncResult)
    at Amazon.Runtime.AmazonWebServiceClient.Invoke(AsyncResult asyncResult)
    at Amazon.Glacier.AmazonGlacierClient.invokeUploadMultipartPart(UploadMultipartPartRequest uploadMultipartPartRequest, AsyncCallback callback, Object state, Boolean synchronized)
    at Amazon.Glacier.AmazonGlacierClient.UploadMultipartPart(UploadMultipartPartRequest uploadMultipartPartRequest)

Note: I am uploading the data in multiple parts.

My sample code is available at the following link: www.page-monitor.com/Downloads/ArchiveUploadMPU.cs

Is there sample code for uploading an archive in parallel?

Thanks and regards, Haseena
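Before the answers, a note on what typically causes this symptom: "The request was aborted: The request was canceled" from threaded .NET uploads is commonly triggered by the default two-connections-per-host limit (`ServicePointManager.DefaultConnectionLimit`) or by multiple threads sharing one request object or one stream. The following is only a sketch of the usual remedy, not the asker's code: it assumes the AWS SDK for .NET v1 Glacier low-level API (`UploadMultipartPartRequest`, `TreeHashGenerator`), and `vaultName`/`uploadId` are hypothetical values obtained earlier from `InitiateMultipartUpload`.

```csharp
// Sketch only: assumes AWS SDK for .NET v1 (Amazon.Glacier namespace);
// vaultName and uploadId come from a prior InitiateMultipartUpload call.
using System;
using System.IO;
using System.Net;
using System.Threading.Tasks;
using Amazon.Glacier;
using Amazon.Glacier.Model;

static class ParallelPartUploader
{
    public static void UploadParts(AmazonGlacierClient client, string vaultName,
                                   string uploadId, string filePath, long partSize)
    {
        // .NET defaults to 2 concurrent HTTP connections per host; raise it,
        // or parallel part uploads queue up and can time out / get canceled.
        ServicePointManager.DefaultConnectionLimit = 10;

        long fileLength = new FileInfo(filePath).Length;
        int partCount = (int)((fileLength + partSize - 1) / partSize);

        Parallel.For(0, partCount, i =>
        {
            long start = i * partSize;
            long end = Math.Min(start + partSize, fileLength) - 1;

            // Each thread reads its own byte range through its own stream;
            // sharing one stream between threads corrupts the read position.
            byte[] buffer = new byte[end - start + 1];
            using (var fs = new FileStream(filePath, FileMode.Open,
                                           FileAccess.Read, FileShare.Read))
            {
                fs.Seek(start, SeekOrigin.Begin);
                fs.Read(buffer, 0, buffer.Length);
            }

            using (var partStream = new MemoryStream(buffer))
            {
                string checksum = TreeHashGenerator.CalculateTreeHash(partStream);
                partStream.Position = 0;

                // Build a fresh request per part; never reuse one request
                // object across threads.
                var request = new UploadMultipartPartRequest
                {
                    VaultName = vaultName,
                    UploadId = uploadId,
                    Checksum = checksum,
                    Body = partStream
                };
                request.SetRange(start, end);
                client.UploadMultipartPart(request);
            }
        });
    }
}
```

After all parts succeed, the upload would still need to be finished with `CompleteMultipartUpload`, passing the tree hash of the whole archive.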

2 Answers:

Answer 0 (score: 0)

I believe there is a race condition in the code. I am working on the same functionality and would be happy to share my code with you. If you end up modifying the code you posted, I would appreciate a link to it. Best regards, Bruce

Answer 1 (score: 0)

Here is sample code that works with threads. ChunkDetails is a custom class used to pass the access key ID, bucket name, offset details, and so on. I am also using a ThrottledStream.

    internal bool UploadUsingHighLevelAPI(String FilePath, ChunkDetails ObjMetaData,
                                          S3Operations.UploadType uploadType,
                                          Stream inputStream)
    {
        String METHOD_NAME = "UploadUsingHighLevelAPI";
        String keyName;
        String existingBucketName;
        TransferUtilityUploadRequest fileTransferUtilityRequest = null;
        int RetryTimes = 3;
        ThrottledStream throttleStreamObj = null;
        long bps = ThrottledStream.Infinite;

        try
        {
            keyName = ObjMetaData.KeyName;
            existingBucketName = ObjMetaData.BucketName;

            TransferUtility fileTransferUtility = new TransferUtility(
                ObjMetaData.AccessKeyID, ObjMetaData.SecretAccessKey);

            FileInfo fin = new FileInfo(FilePath);

            // Throttle the upload to the configured percentage of the
            // maximum available bandwidth (KB/s converted to bytes/s).
            bps = (long)(1024 * ObjMetaData.MaxAvailSpeed * ((double)ObjMetaData.Bandwidth / 100.0));
            throttleStreamObj = new ThrottledStream(ObjMetaData.FileStream, bps);

            // Copy any caller-supplied metadata onto the request.
            System.Collections.Specialized.NameValueCollection metaInfo =
                new System.Collections.Specialized.NameValueCollection();
            if (ObjMetaData.MetaInfo != null)
            {
                foreach (DictionaryEntry kvp in ObjMetaData.MetaInfo)
                {
                    metaInfo.Add(kvp.Key.ToString(), kvp.Value.ToString());
                }
            }

            // The part size is the byte range this chunk owns, capped at
            // the total file length.
            long OffDiff = ObjMetaData.EndOffset - ObjMetaData.StartOffset;
            long partSize = (fin.Length >= OffDiff) ? OffDiff : fin.Length;

            // File and stream uploads build the same request; both read
            // from the throttled stream rather than from the file path.
            if (uploadType == UploadType.File || uploadType == UploadType.Stream)
            {
                fileTransferUtilityRequest =
                    new TransferUtilityUploadRequest()
                        .WithBucketName(existingBucketName)
                        .WithStorageClass(S3StorageClass.ReducedRedundancy)
                        .WithMetadata(metaInfo)
                        .WithPartSize(partSize)
                        .WithKey(keyName)
                        .WithCannedACL(S3CannedACL.PublicRead)
                        .WithTimeout(Int32.MaxValue - 1)
                        .WithInputStream(throttleStreamObj) as TransferUtilityUploadRequest;
            }

            for (int index = 1; index <= RetryTimes; index++)
            {
                try
                {
                    // Upload this part; on success raise the chunk-uploaded
                    // event and stop retrying.
                    fileTransferUtility.Upload(fileTransferUtilityRequest);
                    Console.WriteLine(" ====== Upload Done =========");
                    if (eventChunkUploaded != null)
                        eventChunkUploaded(ObjMetaData);
                    break;
                }
                catch (Exception ex)
                {
                    // Log and raise the error event only after the final attempt.
                    if (index == RetryTimes)
                    {
                        m_objLogFile.LogError(CLASS_NAME, METHOD_NAME + " - Attempt " +
                            index + Environment.NewLine + FilePath, ex);

                        if (eventChunkUploadError != null)
                            eventChunkUploadError(ObjMetaData, ex.Message);
                    }
                }
            }
        }
        catch (Exception ex)
        {
            m_objLogFile.LogError(CLASS_NAME, METHOD_NAME, ex);
            return false;
        }
        finally
        {
            // The underlying stream is owned by the caller (via
            // ObjMetaData.FileStream), so it is not closed here.
            throttleStreamObj = null;
        }

        return true;
    }

Let me know if you run into any problems.
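The method above uploads a single chunk; to run it in parallel, each thread needs its own ChunkDetails and its own stream. A minimal driver sketch follows. The ChunkDetails property names (StartOffset, EndOffset, FileStream, BucketName, KeyName, AccessKeyID, SecretAccessKey, MaxAvailSpeed, Bandwidth) are inferred from how the method uses them and may not match the real class exactly.

```csharp
// Sketch only: assumes the UploadUsingHighLevelAPI method and ChunkDetails
// class from the answer above; property names are inferred, not confirmed.
long chunkSize = 5 * 1024 * 1024;   // 5 MB per chunk (illustrative)
long fileLength = new FileInfo(filePath).Length;
var tasks = new List<Task<bool>>();

for (long start = 0; start < fileLength; start += chunkSize)
{
    long end = Math.Min(start + chunkSize, fileLength);
    var chunk = new ChunkDetails
    {
        BucketName = bucketName,
        KeyName = keyName + ".part" + (start / chunkSize),
        AccessKeyID = accessKeyId,
        SecretAccessKey = secretAccessKey,
        StartOffset = start,
        EndOffset = end,
        MaxAvailSpeed = maxAvailSpeedKBps,
        Bandwidth = bandwidthPercent,
        // Each chunk owns its stream; threads must never share one, or the
        // read position races (the likely cause of the original error).
        FileStream = new FileStream(filePath, FileMode.Open,
                                    FileAccess.Read, FileShare.Read)
    };
    tasks.Add(Task.Factory.StartNew(() =>
        UploadUsingHighLevelAPI(filePath, chunk,
                                S3Operations.UploadType.Stream, null)));
}
Task.WaitAll(tasks.ToArray());
```

Note that each FileStream here starts at position 0; if the real ChunkDetails expects the stream pre-positioned at StartOffset, a Seek would be needed before handing it off.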