I have a web server that generates .tar.gz backup files which I want to transfer automatically to an AWS server.
To do this, I am trying to write a bash script on the AWS server that will automatically check the web server for new backups and, whenever the backup has been updated, keep a copy of it (preserving its timestamp).
Is there an easier or more robust way to solve this problem? And is my FTP script syntax correct?
# Credentials to access other machine
HOST=xxxxxx
USER=xxxxx
PASSWD=xxxxxxx
# path to the remoteBackups
remoteBackups=/home/ubuntu/testBackups
# Loops indefinitely
#while true
#do
# FTP to remote host and get the name of the most recent backup
ftp -inv $HOST<<-EOT
user $USER $PASSWD
#Store name of most recent backup to FILE
# does this work or will it just save it to a variable FILE on the remote machine
FILE=`ls -t ~/Desktop/backups/*.tar.gz | head -1`
bye
EOT
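# (sketch) I suspect the FILE= line above just gets sent to the ftp client
# as input instead of setting a shell variable on this machine. If the web
# server also allows SSH, something like this might capture the name
# instead (hypothetical, reusing the <.pem> key from below):
#FILE=`ssh -i <.pem> $USER@$HOST 'ls -t ~/Desktop/backups/*.tar.gz | head -1'`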
# For testing
echo $FILE
# Copy (preserving modification dates) the file to the local remoteBackups folder on the AWS server
#scp -p -i <.pem> $FILE $remoteBackups
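# (sketch) rsync might be an alternative here: -t preserves modification
# times and unchanged files are skipped (hypothetical form, untested):
#rsync -t -e "ssh -i <.pem>" $USER@$HOST:$FILE $remoteBackups/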
# Get the most recent backup from both directories
latestLocal=`ls -t ~/intranetBackups/*.tar.gz | head -1`
latestRemote=`ls -t $remoteBackups/*.tar.gz | head -1`
# For testing
echo $latestLocal
echo $latestRemote
# If the backup from the remote is newer, save it to backups and sleep for 15 days
if [[ $latestLocal -ot $latestRemote ]]
then
echo Transferring backup from $latestRemote to $latestLocal
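# (sketch) the actual copy would go here, preserving the timestamp;
# both paths are local to the AWS server at this point:
#cp -p $latestRemote ~/intranetBackups/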
sleep 15d
else
echo No new backup file found
sleep 1d
fi
# If there are more than 20 backups delete the oldest
if [[ `ls -1 ~/intranetBackups/*.tar.gz | wc -l` -ge 20 ]]
then
rm `ls -t ~/intranetBackups/*.tar.gz | tail -1`
echo removed the oldest backup
else
echo no file to be removed
fi
#done
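For reference, here is roughly what I imagine an rsync-over-SSH version would look like, in case that is the easier/more robust route (host, user, key, and paths are placeholders, and I have not tested this):

#!/bin/bash
# Sketch only: pull the newest backup with rsync over SSH
HOST=xxxxxx
USER=xxxxx
KEY=<.pem>
remoteBackups=/home/ubuntu/testBackups
# Ask the web server for the name of its newest backup
FILE=`ssh -i $KEY $USER@$HOST 'ls -t ~/Desktop/backups/*.tar.gz | head -1'`
# Copy it, preserving the modification time; rsync skips the transfer
# when an identical copy is already present
rsync -t -e "ssh -i $KEY" $USER@$HOST:$FILE $remoteBackups/
# Keep only the 20 newest backups (assumes no spaces in filenames)
ls -t $remoteBackups/*.tar.gz | tail -n +21 | xargs -r rm --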