I'm trying to write a Python script that starts an instance and then runs some commands on it. I know that if I create a new instance I can pass the commands as user data. What I'm trying to figure out is how to pass them when starting an instance that has already been created. Here is the code that passes a simple "Hello S3" user-data script at creation time, using boto3:
import boto3
userdata = """#cloud-config
repo_update: true
repo_upgrade: all
packages:
  - s3cmd
runcmd:
  - echo 'Hello S3!' > /tmp/hello.txt
  - aws --region YOUR_REGION s3 cp /tmp/hello.txt s3://YOUR_BUCKET/hello.txt
"""
ec2 = boto3.resource('ec2')
instances = ec2.create_instances(
    ImageId='ami-f5f41398',  # default Amazon Linux
    InstanceType='t2.micro',
    KeyName='YOUR_SSH_KEY_NAME',
    MinCount=1,
    MaxCount=1,
    IamInstanceProfile={
        'Arn': 'YOUR_ARN_ID'
    },
    SecurityGroupIds=['YOUR_SECURITY_GROUP_NAME'],
    UserData=userdata
)
Is there something like this?
i = ec2.Instance(id='i-5fea4d42')
i.start('pass commands here: e.g. echo xx, mv a b, etc.')
Answer 0 (score: 0)
You can set the UserData attribute and then start the instance, and it will pick up the change (note that the instance must be stopped while you modify the attribute): http://boto3.readthedocs.io/en/latest/reference/services/ec2.html#EC2.Client.modify_instance_attribute
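A minimal sketch of that approach (the helper name, instance ID, and script are placeholders; assumes your AWS credentials are configured). The instance has to be stopped before `modify_instance_attribute` will accept new user data; the SDK base64-encodes the blob for you:

```python
def update_user_data(ec2_client, instance_id, script):
    """Replace an existing instance's user data, then start it again.

    The instance must be stopped while its UserData attribute is modified.
    """
    ec2_client.stop_instances(InstanceIds=[instance_id])
    ec2_client.get_waiter('instance_stopped').wait(InstanceIds=[instance_id])
    ec2_client.modify_instance_attribute(
        InstanceId=instance_id,
        UserData={'Value': script.encode('utf-8')},
    )
    ec2_client.start_instances(InstanceIds=[instance_id])

# Usage (requires AWS credentials):
# import boto3
# update_user_data(boto3.client('ec2'), 'i-5fea4d42',
#                  "#!/bin/bash\necho hi > /tmp/hi.txt\n")
```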
However, UserData scripts are normally set up to run only once (on first boot), so you would have to change things so they run on every boot: How do I make cloud-init startup scripts run every time my EC2 instance boots?
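For reference, the fix described at that link boils down to telling cloud-init to run the user-scripts module on every boot rather than only the first. A cloud-config fragment along these lines (placed at the top of your user data) is the commonly cited form:

```yaml
#cloud-config
# Run the user scripts module on every boot, not just the first one
cloud_final_modules:
  - [scripts-user, always]
```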
Another approach is to SSH in after the instance boots and run the script that way, but in my opinion pushing it through UserData is cleaner.
Answer 1 (score: 0)
I was able to run everything in the cloud in an automated way using paramiko and boto3, with the awscli invoked on the remote side.
import os
import time

import boto3
import paramiko

# Fill these in for your environment:
key_path = '/path/to/private_key.pem'           # private key used for SSH
s3_code = 's3://YOUR_BUCKET/code/theano_test.py'  # where the model code lives
s3_data = 's3://YOUR_BUCKET/data/'              # input data
s3_out = 's3://YOUR_BUCKET/out/'                # where output should go
local_out = '/path/to/local/out/'               # local copy of the output

# Start the instance, copy data to it, run the model, copy the output to S3
print("\nCreating ssh session")
session = boto3.Session()
ec2 = session.resource('ec2', region_name='us-east-1')
i = ec2.Instance(id='YOUR_INSTANCE_ID')  # instance id
print('\nstarting deeplearning_ami instance')
i.start()
i.wait_until_running()
i.load()
print("Waiting for the checks to finish..")
time.sleep(45)  # crude wait for status checks / sshd to come up
k = paramiko.RSAKey.from_private_key_file(key_path)  # your private key for SSH
c = paramiko.SSHClient()
c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
print ("\nConnecting to shell using ssh")
c.connect( hostname = i.public_dns_name, username = "ec2-user", pkey = k )
print ("\nExecuting commands on EC2 instance\n")
stdin , stdout, stderr = c.exec_command(""" mkdir -p model;
printf 'Creating directory structure, if it does not exists \n';
cd model;
mkdir -p data out cp;
printf '\nCopying code from s3 bucket to ec2 instance \n';
aws s3 cp {0} .;
printf '\nDownloading data from s3 bucket to ec2 instance';
aws s3 sync {1} data;
printf '\n Download complete, running model..\n\n';
python theano_test.py;
echo 'Hello S3!' > out/hello.txt;
printf '\n Copying output to s3 bucket \n';
aws s3 sync out {2} """.format(s3_code,s3_data,s3_out), get_pty=True)
for line in iter(lambda: stdout.readline(2048), ""): print(line, end="")
print("\n\nProcessing finished")
print('Stopping instance')
i.stop()
print ('Copying output from s3 bucket to local')
os.system('aws s3 sync ' + s3_out + ' ' + local_out)
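One refinement worth noting (a sketch, not part of the original answer): paramiko exposes the remote command's exit code via the channel, so you can fail fast instead of assuming the remote script succeeded. `stream_command_output` is a hypothetical helper name:

```python
def stream_command_output(stdout):
    # Echo remote output line by line as it arrives, then return the
    # command's exit code (recv_exit_status blocks until the command ends).
    for line in iter(lambda: stdout.readline(2048), ""):
        print(line, end="")
    return stdout.channel.recv_exit_status()

# Usage with the paramiko session above:
# stdin, stdout, stderr = c.exec_command("...", get_pty=True)
# if stream_command_output(stdout) != 0:
#     raise RuntimeError("remote command failed")
```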