Copy two most recent files to another directory using a bash script

Asked: 2015-05-26 12:20:04

Tags: linux bash backup

I'm trying to create a bash script that makes a daily backup of a MySQL db and a web directory. It should tar each of them, then copy the two most recent .tar.gz files to a weekly directory on day 0 of each week, to a monthly directory on day 1 of each month, and to a yearly directory on day 1 of each year.

I'm having trouble getting the 'copy the two most recent files' part to work.

So far I've used the script from https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script as a base:

#!/bin/sh
# https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Local Source
SOURCE=/path/to/source
# Create directories etc here
DIR=/path/to/backups
# Local Destination
DESTINATION=/path/to/network/share

# Direct all output to logfile found here
#LOG=$$.log
#exec > $LOG 2>&1

# Database Backup User
DATABASE='wordpress'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'

# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
DOW=$(date '+%u')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')

#LATEST=$(ls -t | head -1)
#LATEST_DAILY=$(find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | tail -3)
#DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)
#DAILY=$(ls -1tr $DIR/tmp/daily/ | tail -2 )
DAILY=$(find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | head -2)

# Direct all output to logfile found here
# LOG=$DIR/logs/$$.log
# exec > $LOG 2>&1

# Make Temporary Folder
if [ ! -d "$DIR/tmp" ]; then
        mkdir "$DIR/tmp"
        echo 'Created tmp directory...'
fi

# Make Daily Folder
if [ ! -d "$DIR/tmp/daily" ]; then
        mkdir "$DIR/tmp/weekly"
        echo 'Created daily directory...'
fi

# Make Weekly Folder
if [ ! -d "$DIR/tmp/weekly" ]; then
        mkdir "$DIR/tmp/weekly"
        echo 'Created weekly directory...'
fi

# Make Folder For Current Year
if [ ! -d "$DIR/tmp/${YEAR}" ]; then
        mkdir "$DIR/tmp/${YEAR}"
        echo 'Directory for current year created...'
fi

# Make Folder For Current Month
if [ ! -d "$DIR/tmp/${YEAR}/$MONTH" ]; then
        mkdir "$DIR/tmp/${YEAR}/$MONTH"
        echo 'Directory for current month created...'
fi

# Make The Daily Backup
tar -zcvf $DIR/tmp/daily/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/tmp/database.sql
tar -zcvf $DIR/tmp/daily/${NOW}_database.tar.gz $DIR/tmp/database.sql
rm -rf  $DIR/tmp/database.sql
echo 'Made daily backup...'

# Check whether it's Sunday; if so, copy the two most recent daily backups to the weekly dir.
# (Temporarily testing against 2 -- today is Tuesday, %u = 2; see note below.)
if [ $DOW -eq 2 ] ; then
        cp $DAILY $DIR/tmp/weekly/
fi
echo 'Made weekly backup...'

# Check whether it's the first day of the year; if so, copy the two most recent daily backups to the $YEAR folder
# (Temporarily testing against 146, today's day of the year.)
if [ $DAY_OF_YEAR -eq 146 ] ; then
        cp $DAILY $DIR/tmp/${YEAR}/
fi
echo 'Made annual backup...'

# Check if it's the first day of the month; if so, copy the latest daily backups to the monthly folder
# (Temporarily testing against 26, today's day of the month.)
if [ $DAY_OF_MONTH -eq 26 ] ; then
        cp $DAILY $DIR/tmp/${YEAR}/${MONTH}/
fi
echo 'Made monthly backup...'

# Merge The Backup To The Local Destination's Backup Folder
# cp -rf $DIR/tmp/* $DESTINATION
# Delete The Temporary Folder
# rm -rf $DIR/tmp
# Delete daily backups older than 7 days
# find $DESTINATION -mtime +7 -exec rm {} \;
echo "Backup complete. Log can be found under $DIR/logs/."

I've commented out some parts for now while I work on getting this bit going, and I've set the day/month/year checks to today's values so I can see the files being copied. I've also left in the previously commented-out $DAILY variants.

The problem I'm getting is that when the script executes, it returns the following:

./backup-rotation-script.sh                            
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory   
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made weekly backup...                                                       
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory   
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made annual backup...                                                       
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory   
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made monthly backup...                                                      
Backup complete. Log can be found under /path/to/backups/logs/.  

But when I check /path/to/backups/tmp/daily/ the files are there, and it's clearly seeing them, since it returns the file names in the errors.

From what I can gather, this is because $DAILY (find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | head -2) returns the two results on one line? I'm assuming the easiest way to get this working is probably a for loop that copies the two results to the weekly/monthly/yearly directories?

I tried adding variations of:

for file in `ls -1t /path/to/backups/tmp/daily/ | head -n2`
do
   cp $file /path/to/backups/tmp/weekly/
done

But it didn't go so well. :S
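Looking at it again, I suspect the loop fails for the same reason as the cp above: ls -1t prints bare filenames, so cp looks for them in the script's working directory rather than in the daily folder. A variant that prefixes the directory should work, though this is an untested sketch:

for file in $(ls -1t /path/to/backups/tmp/daily/ | head -n2)
do
   # Prefix the directory so cp gets a full path, and quote against odd filenames.
   cp "/path/to/backups/tmp/daily/$file" /path/to/backups/tmp/weekly/
done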

Ideally I'd also like it to report if it fails, but I haven't gotten that far yet. :)

Any help is much appreciated!

1 Answer:

Answer 0 (score: 2):

Never mind! Figured it out.

I removed the $DAILY variable entirely and used the following for the copy instead:

find $DIR/tmp/daily/ -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/tmp/weekly/
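This works because find prints full paths (unlike ls, which gave cp bare filenames it couldn't resolve from the script's working directory), and because the filenames start with a sortable YYYY-MM-DD-HHMM timestamp. If the naming scheme ever changes, sorting on modification time instead should be more robust; a rough sketch using GNU find's %T@ format (untested here):

# Sketch: sort by mtime (seconds since epoch) instead of by filename,
# then strip the timestamp column before copying.
find $DIR/tmp/daily/ -type f -printf "%T@ %p\n" | sort -rn | head -n 2 | cut -d' ' -f2- | xargs -I{} cp {} $DIR/tmp/weekly/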

So the script now looks like this:

#!/bin/bash
# Original script: https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Edited/hacked/chopped/stuff by Khaito

# Redirect all script output to log file located in log directory with date in name.
exec 3>&1 4>&2
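# (fds 3 and 4 save the original stdout/stderr; the trap below restores them when the script exits or is interrupted)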
trap 'exec 2>&4 1>&3' 0 1 2 3 RETURN
exec 1>/path/to/logs/$(date +"%Y-%m-%d-%H%M")_intranet.log 2>&1

# Local Source
SOURCE=/path/to/source
# Create directories etc here
LOCAL=/path/to/backups
DIR=/path/to/backups/intranet
DIRD=/path/to/backups/intranet/daily
DIRW=/path/to/backups/intranet/weekly
DIRM=/path/to/backups/intranet/monthly

# Local Destination
DESTINATION=/path/to/network/share

# Database Backup User
DATABASE='dbname'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'
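# (Note: a password passed on the command line is visible to other users via ps;
# mysqldump's --defaults-extra-file option avoids this.)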

# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')
DOW=$(date '+%u')
YEARMONTH=$(date +"%Y-%m-%B")

# Make Intranet Folder
if [ ! -d "$LOCAL/intranet" ]; then
        mkdir "$LOCAL/intranet"
        echo 'Intranet directory created...'
fi

# Make Daily Folder
if [ ! -d "$DIR/daily" ]; then
        mkdir "$DIR/daily"
        echo 'Daily directory created...'
fi

# Make Weekly Folder
if [ ! -d "$DIR/weekly" ]; then
        mkdir "$DIR/weekly"
        echo 'Weekly directory created...'
fi

# Make Monthly Folder
if [ ! -d "$DIR/monthly" ]; then
        mkdir "$DIR/monthly"
        echo 'Monthly directory created...'
fi

# Make Folder For Current Year
if [ ! -d "$DIR/${YEAR}" ]; then
        mkdir "$DIR/${YEAR}"
        echo 'Directory for current year created...'
fi

# Tar the intranet files then dump the db, tar it then remove the original dump file.
tar -cvzf $DIRD/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/database.sql
tar -cvzf $DIRD/${NOW}_database.tar.gz $DIR/database.sql
rm -rf  $DIR/database.sql
echo 'Made daily backup...'

# Check if it's Sunday; if so, copy the two most recent daily files to the weekly folder.
# (%w gives 0 for Sunday; %u ranges 1-7 and never equals 0, so test DAY_OF_WEEK_RAW.)
if [ $DAY_OF_WEEK_RAW -eq 0 ] ; then
        find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRW
        echo 'Made weekly backup...'
fi

# Check if it's the first day of the month; if so, copy the two most recent daily files to the monthly folder
if [ $DAY_OF_MONTH -eq 1 ] ; then
        find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRM
        echo 'Made monthly backup...'
fi

# Check if it's the first day of the year; if so, copy the two most recent daily files to the current year folder
if [ $DAY_OF_YEAR -eq 1 ] ; then
        find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/${YEAR}/
        echo 'Made annual backup...'
fi

# Rsync the new files to the network share for backup to tape
rsync -hvrPt $DIR/* $DESTINATION

# Delete local backups
# find $DIRD -mtime +8 -exec rm {} \;
# find $DIRW -mtime +15 -exec rm {} \;
# find $DIRM -mtime +2 -exec rm {} \;
# find $DIR/${YEAR} -mtime +2 -exec rm {} \;

# Delete daily backups older than 7 days on network share
# find $DESTINATION/daily -mtime +8 -exec rm {} \;
# Delete weekly backups older than 31 days on network share
# find $DESTINATION/weekly -mtime +32 -exec rm {} \;
# Delete monthly backups older than 365 days on network share
# find $DESTINATION/monthly -mtime +366 -exec rm {} \;

echo 'Backup complete. Log can be found under /path/to/logs/.'
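
On the 'report if it fails' part from the question: since all output already goes to the log, a minimal approach is to test the exit status of the critical commands and bail out with a message. A rough sketch (not tested against this exact script):

# Sketch: abort with a logged message if a critical step fails.
if ! mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/database.sql; then
        echo "mysqldump failed, aborting." >&2
        exit 1
fi

if ! rsync -hvrPt $DIR/* $DESTINATION; then
        echo "rsync to $DESTINATION failed." >&2
        exit 1
fi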