Parallel download from AWS S3 using Golang

Date: 2019-01-29 11:34:54

Tags: amazon-web-services go amazon-s3 aws-sdk

I am writing a function that downloads a large file (9 GB) from an AWS S3 bucket using the aws-sdk. I need to optimize it so the file downloads as quickly as possible.

func DownloadFromS3Bucket(bucket, item, path string) {
    os.Setenv("AWS_ACCESS_KEY_ID", constants.AWS_ACCESS_KEY_ID)
    os.Setenv("AWS_SECRET_ACCESS_KEY", constants.AWS_SECRET_ACCESS_KEY)

    file, err := os.Create(filepath.Join(path, item))
    if err != nil {
        fmt.Printf("Error creating file: %v \n", err)
        os.Exit(1)
    }

    defer file.Close()

    sess, err := session.NewSession(&aws.Config{
        Region: aws.String(constants.AWS_REGION)},
    )
    if err != nil {
        fmt.Printf("Error creating AWS session: %v \n", err)
        os.Exit(1)
    }

    downloader := s3manager.NewDownloader(sess)

    numBytes, err := downloader.Download(file,
        &s3.GetObjectInput{
            Bucket: aws.String(bucket),
            Key:    aws.String(item),
        })
    if err != nil {
        fmt.Printf("Error in downloading from file: %v \n", err)
        os.Exit(1)
    }

    fmt.Println("Download completed", file.Name(), numBytes, "bytes")
}

Can someone suggest a solution to extend this function?

1 Answer:

Answer 0 (score: 4)

Try changing your NewDownloader() call to this. See https://docs.aws.amazon.com/sdk-for-go/api/service/s3/s3manager/#NewDownloader

// Create a downloader with the session and custom options
downloader := s3manager.NewDownloader(sess, func(d *s3manager.Downloader) {
    d.PartSize = 64 * 1024 * 1024 // 64MB per part
    d.Concurrency = 4
})

The list of options you can set on d inside that function can be found here:
https://docs.aws.amazon.com/sdk-for-go/api/service/s3/s3manager/#Downloader
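
Putting the two pieces together, here is a minimal sketch of what the whole function could look like with a tuned downloader. It is only an illustration: the region string, the 64 MB part size, and a concurrency of 6 are placeholder values to adjust for your bandwidth and memory budget, and it assumes credentials come from the environment or shared AWS config rather than from os.Setenv. Download writes each part at its own offset through the io.WriterAt interface, which *os.File satisfies, so parts are fetched in parallel straight into the destination file.

package main

import (
    "fmt"
    "log"
    "os"
    "path/filepath"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3"
    "github.com/aws/aws-sdk-go/service/s3/s3manager"
)

// DownloadFromS3Bucket downloads bucket/item into path using a concurrent,
// multipart downloader. Credentials are assumed to be available from the
// environment or shared config; region and tuning values are placeholders.
func DownloadFromS3Bucket(bucket, item, path string) error {
    file, err := os.Create(filepath.Join(path, item))
    if err != nil {
        return fmt.Errorf("creating file: %v", err)
    }
    defer file.Close()

    sess, err := session.NewSession(&aws.Config{
        Region: aws.String("us-east-1"), // placeholder: use your bucket's region
    })
    if err != nil {
        return fmt.Errorf("creating AWS session: %v", err)
    }

    // Larger parts mean fewer requests; higher concurrency downloads more
    // parts in parallel at the cost of more connections and memory.
    downloader := s3manager.NewDownloader(sess, func(d *s3manager.Downloader) {
        d.PartSize = 64 * 1024 * 1024 // 64MB per part
        d.Concurrency = 6             // illustrative; tune for your bandwidth
    })

    numBytes, err := downloader.Download(file, &s3.GetObjectInput{
        Bucket: aws.String(bucket),
        Key:    aws.String(item),
    })
    if err != nil {
        return fmt.Errorf("downloading object: %v", err)
    }

    log.Printf("Download completed: %s (%d bytes)", file.Name(), numBytes)
    return nil
}

Returning the error instead of calling os.Exit lets the caller decide how to handle a failed download, which is usually friendlier when the function is reused.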