How to recursively upload a folder to AWS S3 using Ansible

Date: 2016-07-15 13:42:42

Tags: amazon-web-services amazon-s3 amazon ansible ansible-playbook

I am using Ansible to deploy my application. I have reached the point where I want to upload my grunted assets to a newly created bucket. Here is what I did, where {{hostvars.localhost.public_bucket}} is the bucket name and {{client}}/{{version_id}}/assets/admin is the path to a folder containing multi-level subfolders and assets to upload:

- s3:
    aws_access_key: "{{ lookup('env','AWS_ACCESS_KEY_ID') }}"
    aws_secret_key: "{{ lookup('env','AWS_SECRET_ACCESS_KEY') }}"
    bucket: "{{hostvars.localhost.public_bucket}}"
    object: "{{client}}/{{version_id}}/assets/admin"
    src: "{{trunk}}/public/assets/admin"
    mode: put

Here is the error message:

    fatal: [x.y.z.t]: FAILED! => {"changed": false, "failed": true, "invocation": {"module_name": "s3"}, "module_stderr": "", "msg": "MODULE FAILURE", "parsed": false}

    Traceback (most recent call last):
      File "/home/ubuntu/.ansible/tmp/ansible-tmp-1468581761.67-193149771659393/s3", line 2868, in <module>
        main()
      File "/home/ubuntu/.ansible/tmp/ansible-tmp-1468581761.67-193149771659393/s3", line 561, in main
        upload_s3file(module, s3, bucket, obj, src, expiry, metadata, encrypt, headers)
      File "/home/ubuntu/.ansible/tmp/ansible-tmp-1468581761.67-193149771659393/s3", line 307, in upload_s3file
        key.set_contents_from_filename(src, encrypt_key=encrypt, headers=headers)
      File "/usr/local/lib/python2.7/dist-packages/boto/s3/key.py", line 1358, in set_contents_from_filename
        with open(filename, 'rb') as fp:
    IOError: [Errno 21] Is a directory: '/home/abcd/efgh/public/assets/admin'

I went through the documentation but did not find a recursive option for the Ansible s3 module. Is this a bug, or am I missing something?

3 Answers:

Answer 0 (score: 3)

The Ansible s3 module does not support directory uploads or any kind of recursion. For this task, I would recommend shelling out to the AWS CLI; check the syntax below.

command: "aws s3 cp {{client}}/{{version_id}}/assets/admin s3://{{hostvars.localhost.public_bucket}}/ --recursive"
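Wrapped as a complete task, this might look like the sketch below. The environment block is an assumption (not part of the original answer) and simply forwards the same credentials the question's s3 task read via lookup('env', ...); the AWS CLI picks them up automatically:

    - name: Recursively upload assets with the AWS CLI
      command: "aws s3 cp {{client}}/{{version_id}}/assets/admin s3://{{hostvars.localhost.public_bucket}}/ --recursive"
      environment:
        AWS_ACCESS_KEY_ID: "{{ lookup('env','AWS_ACCESS_KEY_ID') }}"
        AWS_SECRET_ACCESS_KEY: "{{ lookup('env','AWS_SECRET_ACCESS_KEY') }}"

Note that command tasks always report "changed", so this loses the idempotence the s3 module would normally give you.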

Answer 1 (score: 3)

Since you are using Ansible, it looks like you want something idempotent, but Ansible does not support recursive S3 directory uploads, so you should probably use the AWS CLI to do a job like this:

command: "aws s3 cp {{client}}/{{version_id}}/assets/admin s3://{{hostvars.localhost.public_bucket}}/ --recursive"
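As an aside, later Ansible releases (2.3 and up) added an s3_sync module that uploads a directory tree natively and idempotently. A minimal sketch, assuming a recent enough Ansible; the parameter values are mapped from the question's variables:

    - s3_sync:
        bucket: "{{ hostvars.localhost.public_bucket }}"
        file_root: "{{ trunk }}/public/assets/admin"
        key_prefix: "{{ client }}/{{ version_id }}/assets/admin"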

Answer 2 (score: 1)

I was able to accomplish this with the s3 module by iterating over a listing of the directory I wanted to upload. The small inline Python script, run through the command module, just outputs the full list of file paths in the directory, formatted as JSON.

- name: upload things
  hosts: localhost
  connection: local

  tasks:
    - name: Get all the files in the directory I want to upload, formatted as a JSON list
      command: python -c 'import os, json; print json.dumps([os.path.join(dp, f)[2:] for dp, dn, fn in os.walk(os.path.expanduser(".")) for f in fn])'
      args:
        chdir: ../../styles/img
      register: static_files_cmd

    - s3:
        bucket: "{{ bucket_name }}"
        mode: put
        object: "{{ item }}"
        src: "../../styles/img/{{ item }}"
        permission: "public-read"
      with_items: "{{ static_files_cmd.stdout|from_json }}"
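For reference, the inline one-liner above can be unpacked into a standalone script (ported to Python 3, since the original uses the Python 2 print statement) to see exactly what it emits. The throwaway temporary directory tree below is purely illustrative, standing in for ../../styles/img:

```python
import json
import os
import tempfile

# Build a small throwaway tree standing in for ../../styles/img (illustration only).
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "icons"))
for rel in ("logo.png", os.path.join("icons", "ok.svg")):
    open(os.path.join(root, rel), "w").close()

# Same logic as the one-liner: walk the tree and collect file paths relative to
# the root. The answer's [2:] slice strips the leading "./" that os.walk(".")
# prepends; os.path.relpath achieves the same without depending on the cwd.
paths = [os.path.relpath(os.path.join(dp, f), root)
         for dp, dn, fn in os.walk(root)
         for f in fn]
print(json.dumps(sorted(paths)))
```

Each emitted path is then reused twice per loop iteration: once as the S3 object key and once, joined back onto the source directory, as the local src path.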