Streaming a file upload to AWS S3 using Go

Date: 2015-12-09 11:01:53

Tags: file-upload go amazon-s3

I want to upload (large) multipart/form-data files directly to AWS S3 using as little memory and local disk space as possible. How can I achieve this? The resources I have found online only explain how to upload a file and store it locally on the server first.

5 answers:

Answer 0 (score: 6)

You can do this with minio-go:

n, err := s3Client.PutObject("bucket-name", "objectName", object, size, "application/octet-stream")

PutObject() automatically performs a multipart upload internally. Example
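
The one-liner above matches an older minio-go signature. Against the current minio-go v7 API, the same streaming upload would look roughly like the sketch below; the endpoint, credentials, bucket, object name, and file name are placeholders, not part of the original answer:

package main

import (
    "context"
    "log"
    "os"

    "github.com/minio/minio-go/v7"
    "github.com/minio/minio-go/v7/pkg/credentials"
)

func main() {
    // Create a client against the S3 endpoint; keys here are placeholders.
    s3Client, err := minio.New("s3.amazonaws.com", &minio.Options{
        Creds:  credentials.NewStaticV4("ACCESS_KEY", "SECRET_KEY", ""),
        Secure: true,
    })
    if err != nil {
        log.Fatalln(err)
    }

    object, err := os.Open("my-file.zip")
    if err != nil {
        log.Fatalln(err)
    }
    defer object.Close()

    info, err := object.Stat()
    if err != nil {
        log.Fatalln(err)
    }

    // PutObject streams the reader and switches to a multipart upload
    // internally for large objects.
    uploadInfo, err := s3Client.PutObject(context.Background(), "bucket-name", "objectName",
        object, info.Size(), minio.PutObjectOptions{ContentType: "application/octet-stream"})
    if err != nil {
        log.Fatalln(err)
    }
    log.Println("uploaded bytes:", uploadInfo.Size)
}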

Answer 1 (score: 1)

You can use the upload manager to stream and upload the file; also read the comments in the source code. You can configure parameters such as the part size, concurrency & max upload parts. Sample code is below for reference.

package main

import (
    "fmt"
    "os"

    "github.com/aws/aws-sdk-go/aws/credentials"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3/s3manager"
)

var filename = "file_name.zip"
var myBucket = "myBucket"
var myKey = "file_name.zip"
var accessKey = ""
var accessSecret = ""

func main() {
    var awsConfig *aws.Config
    if accessKey == "" || accessSecret == "" {
        //load default credentials
        awsConfig = &aws.Config{
            Region: aws.String("us-west-2"),
        }
    } else {
        awsConfig = &aws.Config{
            Region:      aws.String("us-west-2"),
            Credentials: credentials.NewStaticCredentials(accessKey, accessSecret, ""),
        }
    }

    // The session the S3 Uploader will use
    sess := session.Must(session.NewSession(awsConfig))

    // Create an uploader with the session and default options
    //uploader := s3manager.NewUploader(sess)

    // Create an uploader with the session and custom options
    uploader := s3manager.NewUploader(sess, func(u *s3manager.Uploader) {
        u.PartSize = 5 * 1024 * 1024 // The minimum/default allowed part size is 5MB
        u.Concurrency = 2            // default is 5
    })

    //open the file
    f, err := os.Open(filename)
    if err != nil {
        fmt.Printf("failed to open file %q, %v", filename, err)
        return
    }
    defer f.Close()

    // Upload the file to S3.
    result, err := uploader.Upload(&s3manager.UploadInput{
        Bucket: aws.String(myBucket),
        Key:    aws.String(myKey),
        Body:   f,
    })

    //in case it fails to upload
    if err != nil {
        fmt.Printf("failed to upload file, %v", err)
        return
    }
    fmt.Printf("file uploaded to, %s\n", result.Location)
}
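
Since the question is about streaming a multipart/form-data request straight to S3 without staging it on disk: UploadInput.Body only needs an io.Reader, so a multipart part read from the HTTP request can be handed to the uploader directly. A minimal handler sketch follows; it reuses uploader and myBucket from the snippet above, additionally needs the io and net/http imports, and the "file" form field name is an assumption:

// Sketch only: streams each uploaded form file directly to S3 without
// writing it to local disk first.
func uploadHandler(uploader *s3manager.Uploader) http.HandlerFunc {
    return func(w http.ResponseWriter, r *http.Request) {
        // MultipartReader streams the request body instead of buffering
        // the whole form in memory or in temp files.
        mr, err := r.MultipartReader()
        if err != nil {
            http.Error(w, err.Error(), http.StatusBadRequest)
            return
        }
        for {
            part, err := mr.NextPart()
            if err == io.EOF {
                break
            }
            if err != nil {
                http.Error(w, err.Error(), http.StatusBadRequest)
                return
            }
            if part.FormName() != "file" {
                continue
            }
            // part is an io.Reader, so the uploader can consume it
            // directly in PartSize chunks.
            result, err := uploader.Upload(&s3manager.UploadInput{
                Bucket: aws.String(myBucket),
                Key:    aws.String(part.FileName()),
                Body:   part,
            })
            if err != nil {
                http.Error(w, err.Error(), http.StatusInternalServerError)
                return
            }
            fmt.Fprintf(w, "file uploaded to %s\n", result.Location)
        }
    }
}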

Answer 2 (score: 0)

Another option is to mount the S3 bucket with goofys and then stream your writes to the mountpoint. goofys does not buffer content locally, so it works well with large files.
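
Once the bucket is mounted (for example at /mnt/s3), the upload reduces to ordinary file I/O. A minimal sketch, assuming that mount path and object name:

package main

import (
    "io"
    "log"
    "os"
)

func main() {
    // /mnt/s3 is assumed to be a goofys mount of the target bucket.
    dst, err := os.Create("/mnt/s3/objectName")
    if err != nil {
        log.Fatalln(err)
    }
    defer dst.Close()

    // Stream from any io.Reader (stdin here stands in for the source);
    // nothing is buffered locally beyond io.Copy's small buffer.
    if _, err := io.Copy(dst, os.Stdin); err != nil {
        log.Fatalln(err)
    }
}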

Answer 3 (score: -1)

I haven't tried it, but if I were you, I would try the multipart upload option.

You can read the documentation: multipartupload

here is an example of a multipart upload and of aborting a multipart upload; a rough sketch of that flow follows below.
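
For reference, a rough sketch of the low-level multipart flow with aws-sdk-go (create the upload, send 5 MB parts, complete it, and abort on failure). The bucket, key, file name, and region below are placeholders:

package main

import (
    "bytes"
    "fmt"
    "io"
    "log"
    "os"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3"
)

func main() {
    bucket, key := aws.String("myBucket"), aws.String("file_name.zip")
    svc := s3.New(session.Must(session.NewSession(&aws.Config{Region: aws.String("us-west-2")})))

    f, err := os.Open("file_name.zip")
    if err != nil {
        log.Fatalln(err)
    }
    defer f.Close()

    // 1. Start the multipart upload.
    create, err := svc.CreateMultipartUpload(&s3.CreateMultipartUploadInput{Bucket: bucket, Key: key})
    if err != nil {
        log.Fatalln(err)
    }

    var completed []*s3.CompletedPart
    buf := make([]byte, 5*1024*1024) // 5 MB is the minimum part size
    for partNum := int64(1); ; partNum++ {
        n, readErr := io.ReadFull(f, buf)
        if n > 0 {
            // 2. Upload each part; Body must be an io.ReadSeeker.
            out, err := svc.UploadPart(&s3.UploadPartInput{
                Bucket:     bucket,
                Key:        key,
                UploadId:   create.UploadId,
                PartNumber: aws.Int64(partNum),
                Body:       bytes.NewReader(buf[:n]),
            })
            if err != nil {
                // 3a. Abort so the partial upload does not keep using storage.
                svc.AbortMultipartUpload(&s3.AbortMultipartUploadInput{
                    Bucket: bucket, Key: key, UploadId: create.UploadId,
                })
                log.Fatalln(err)
            }
            completed = append(completed, &s3.CompletedPart{ETag: out.ETag, PartNumber: aws.Int64(partNum)})
        }
        if readErr == io.EOF || readErr == io.ErrUnexpectedEOF {
            break
        }
        if readErr != nil {
            log.Fatalln(readErr)
        }
    }

    // 3b. Complete the upload with the collected part ETags.
    _, err = svc.CompleteMultipartUpload(&s3.CompleteMultipartUploadInput{
        Bucket:          bucket,
        Key:             key,
        UploadId:        create.UploadId,
        MultipartUpload: &s3.CompletedMultipartUpload{Parts: completed},
    })
    if err != nil {
        log.Fatalln(err)
    }
    fmt.Println("multipart upload complete")
}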

Answer 4 (score: -2)

Amazon has an official Go package for uploading files to S3.

http://docs.aws.amazon.com/sdk-for-go/api/service/s3/s3manager/Uploader.html

They also have an example of uploading a file while compressing it on the fly.

https://github.com/aws/aws-sdk-go/wiki/common-examples#upload-an-arbitrarily-sized-stream-with-amazon-s3-upload-manager
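
That example streams an arbitrarily sized reader through gzip into the upload manager via an io.Pipe. A rough sketch of the pattern, where the bucket, key, region, and stdin source are placeholders:

package main

import (
    "compress/gzip"
    "io"
    "log"
    "os"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3/s3manager"
)

func main() {
    sess := session.Must(session.NewSession(&aws.Config{Region: aws.String("us-west-2")}))
    uploader := s3manager.NewUploader(sess)

    // The pipe connects the gzip writer to the uploader's reader, so the
    // compressed stream never has to fit in memory or on disk.
    pr, pw := io.Pipe()
    go func() {
        gw := gzip.NewWriter(pw)
        // os.Stdin stands in for whatever source stream you have.
        _, err := io.Copy(gw, os.Stdin)
        if err == nil {
            err = gw.Close()
        }
        pw.CloseWithError(err) // propagate any error to the reader side
    }()

    // The uploader consumes the pipe and performs a multipart upload.
    result, err := uploader.Upload(&s3manager.UploadInput{
        Bucket: aws.String("myBucket"),
        Key:    aws.String("stream.gz"),
        Body:   pr,
    })
    if err != nil {
        log.Fatalln(err)
    }
    log.Println("uploaded to", result.Location)
}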

Not sure if this helps. Your question is a bit vague.