I want to upload a file from a remote host to an S3 bucket, but using the credentials from my local execution environment. Is that possible?
- name: Upload file
  hosts: '{{ target }}'
  gather_facts: False
  tasks:
    - name: copy file to bucket
      become: yes
      aws_s3:
        bucket: "{{ bucket_name }}"
        object: "{{ filename }}"
        src: "/var/log/{{ filename }}"
        mode: put
Is there an option I can use for this? Ideally something like:
AWS_PROFILE=MyProfile ansible-playbook upload_file.yml -e target=somehost -e bucket_name=mybucket -e filename=myfile
so that I can point at a profile from my own local .aws/config file.
Obviously, when I run the playbook like this:
ansible-playbook upload_file.yml -e target=somehost -e bucket_name=mybucket -e filename=myfile
I get the following error:
TASK [copy file to bucket] ******************************************************************************************************************************************************************************************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: NoCredentialsError: Unable to locate credentials
fatal: [somehost]: FAILED! => {"boto3_version": "1.7.50", "botocore_version": "1.10.50", "changed": false, "msg": "Failed while looking up bucket (during bucket_check) adverity-trash.: Unable to locate credentials"}
But when I try the following:
AWS_ACCESS_KEY=<OWN_VALID_KEY> AWS_SECRET_KEY=<OWN_VALID_SECRET> ansible-playbook upload_file.yml -e target=somehost -e bucket_name=mybucket -e filename=myfile
I get the same error.
Ansible v2.6
Answer 0 (score: 2)
The question here boils down to: how do you pass environment variables from one host to another? The answer is hostvars. Feel free to do your own research on hostvars, but this gives the general idea: https://docs.ansible.com/ansible/latest/reference_appendices/faq.html#how-do-i-see-all-the-inventory-vars-defined-for-my-host
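A quick way to inspect what hostvars holds for a given host (a sketch; substitute any inventory host name for localhost):

ansible -m debug -a "var=hostvars['localhost']" localhost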
Step 1: gather the AWS credentials from localhost (the machine you run ansible from). Important note: make sure gather_facts is set to true, otherwise the env lookup Jinja2 plugin will not find the keys (assuming you have set them as environment variables on localhost).
- name: Set Credentials
  hosts: localhost
  gather_facts: true
  tasks:
    - name: Set AWS KEY ID
      set_fact: AWS_ACCESS_KEY_ID="{{ lookup('env','AWS_ACCESS_KEY_ID') }}"
    - name: Set AWS SECRET
      set_fact: AWS_SECRET_ACCESS_KEY="{{ lookup('env','AWS_SECRET_ACCESS_KEY') }}"
Step 2: import those variables from localhost using set_fact and the hostvars Jinja2 plugin.
Step 3: use the variables on {{ target }}.
Steps 2 and 3 are put together below.
- name: Upload file
  hosts: '{{ target }}'
  gather_facts: False
  tasks:
    - name: Get AWS KEY ID
      set_fact: aws_key_id="{{ hostvars['localhost']['AWS_ACCESS_KEY_ID'] }}"
    - name: Get AWS SECRET KEY
      set_fact: aws_secret_key="{{ hostvars['localhost']['AWS_SECRET_ACCESS_KEY'] }}"
    - name: copy file to bucket
      become: yes
      aws_s3:
        bucket: "{{ bucket_name }}"
        object: "{{ filename }}"
        src: "/var/log/{{ filename }}"
        mode: put
        aws_access_key: "{{ aws_key_id }}"
        aws_secret_key: "{{ aws_secret_key }}"
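Both plays can live in the same playbook file; a run then looks like the command from the question, with the keys exported in the local shell first (a sketch using the question's own values):

export AWS_ACCESS_KEY_ID=<OWN_VALID_KEY>
export AWS_SECRET_ACCESS_KEY=<OWN_VALID_SECRET>
ansible-playbook upload_file.yml -e target=somehost -e bucket_name=mybucket -e filename=myfile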
Answer 1 (score: 1)
Here is the solution that satisfied my problem.
With the help of @einarc and Ansible hostvars, I was able to implement the remote upload using credentials from the local environment. Fact gathering is not necessary; I used delegate_to to execute some tasks locally. Everything is in a single playbook.
- name: Transfer file
  hosts: '{{ target }}'
  gather_facts: False
  tasks:
    - name: Set AWS KEY ID
      set_fact: aws_key_id="{{ lookup('env','AWS_ACCESS_KEY_ID') }}"
      delegate_to: 127.0.0.1
    - name: Set AWS SECRET
      set_fact: aws_secret_key="{{ lookup('env','AWS_SECRET_ACCESS_KEY') }}"
      delegate_to: 127.0.0.1
    - name: Get AWS KEY ID
      set_fact: aws_key_id="{{ hostvars[inventory_hostname]['aws_key_id'] }}"
    - name: Get AWS SECRET KEY
      set_fact: aws_secret_key="{{ hostvars[inventory_hostname]['aws_secret_key'] }}"
    - name: ensure boto is available
      become: true
      pip: name=boto3 state=present
    - name: copy file to bucket
      become: yes
      aws_s3:
        aws_access_key: "{{ aws_key_id }}"
        aws_secret_key: "{{ aws_secret_key }}"
        bucket: my-bucket
        object: "{{ filename }}"
        src: "/some/path/{{ filename }}"
        mode: put
Bonus: I found a way to avoid putting the aws credentials explicitly on the command line. I used the following bash wrapper to fetch the credentials from a config profile with the help of aws-cli.
#!/bin/bash
AWS_ACCESS_KEY_ID=$(aws configure get aws_access_key_id --profile "$1")
AWS_SECRET_ACCESS_KEY=$(aws configure get aws_secret_access_key --profile "$1")
AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
ansible-playbook transfer_to_s3.yml -e target="$2" -e filename="$3"
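Invoked like this (the wrapper's file name is my own choice, not from the original):

./transfer_to_s3.sh MyProfile somehost myfile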
Answer 2 (score: 0)
The error appears because environment variables are not propagated to the remote host when you run the playbook.
As the documentation explains (at the bottom of the page), environment variables or a boto configuration file can be used with aws_s3, but they have to be present on the host that performs the push.
So what I would do is: keep the credentials in a vars file, push a boto config file to the remote host from a template, and then run the aws_s3 module.

vars/aws.yml:

---
aws_access_key_id: 24d32dsa24da24sa2a2ss
aws_access_key: 2424dadsxxx
boto.j2:

[Credentials]
aws_access_key_id = {{ aws_access_key_id }}
aws_secret_access_key = {{ aws_access_key }}
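The pieces fit together in a layout along these lines (my assumed naming; src: boto.j2 resolves relative to the playbook, and the vars file path matches vars_files below):

upload_file.yml
boto.j2
vars/
  aws.yml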
- name: Upload file
  hosts: '{{ target }}'
  gather_facts: False
  vars_files:
    - vars/aws.yml
  tasks:
    - name: push boto template
      template:
        src: boto.j2
        dest: "{{ ansible_user_dir }}/.boto"
        mode: '0400'
    - name: copy file to bucket
      become: yes
      aws_s3:
        bucket: "{{ bucket_name }}"
        object: "{{ filename }}"
        src: "/var/log/{{ filename }}"
        mode: put
P.S.: the pushed .boto file is where the aws_s3 task picks up its credentials on the remote host.
Answer 3 (score: -1)
Why not use the aws cli to get this done?
You can install the aws cli first:
apt-get install awscli
and then copy the file with:
aws s3 cp <source> <destination>
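Wired into the playbook from the question, that suggestion could look roughly like this (a sketch; note the aws cli on the remote host still needs credentials there, which is the original problem):

- name: copy file to bucket with the aws cli
  command: aws s3 cp /var/log/{{ filename }} s3://{{ bucket_name }}/{{ filename }}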
Let me know if you run into any issues.