How to create a backup script that compares dates and deletes the oldest files

Asked: 2013-04-03 03:18:38

Tags: linux bash date backup

I have a backup script I'm writing for a Minecraft server I run. I've gotten to the point where I can merge all the files into one folder, name it with the current date and time, and then compress it into a single .zip file that WorldEdit recognizes as a backup. The problem is that I want the script to recognize when it reaches 4 backups, at which point it should start deleting the oldest one by comparing dates. I also need it to not fail when there are fewer than 4 backup files. How can I solve this? Here is my script.

#!/bin/bash
DIR="$(dirname "$0")"
cd "$DIR" || exit 1
clear
while true
do
    # Set $DATE to the current date and time
    DATE="$(date +%Yy-%mm-%dd_%Hh-%Mm)"
    # Make a directory named with the date and time
    mkdir "$DATE"
    # Copy all files into one folder named with the date and time
    cp ~/Desktop/Test.command ~/Desktop/Test2.command "$DIR/$DATE"
    sleep 1
    # Compress the folder into $DATE.zip and remove the previous files
    zip "$DATE" -rm "$DATE"
    # Wait for round 2 in 1 hour
    echo "Waiting 1 hour"
    sleep 3600
done
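A minimal sketch of the rotation the question asks for (the `*.zip` glob and the limit of four are my assumptions, based on the naming scheme above): list the zips newest-first and delete everything past the fourth. With four or fewer backups, `tail -n +5` prints nothing, so nothing is deleted and nothing fails.

```shell
#!/bin/bash
# Sketch: keep only the four newest *.zip backups in the current directory.
# Assumes GNU xargs (for -r) and file names without newlines, which holds
# for the $DATE naming scheme above.
prune_backups() {
    # ls -t lists newest first; tail -n +5 skips the four newest;
    # xargs -r runs rm only if there is something to delete.
    ls -t -- *.zip 2>/dev/null | tail -n +5 | xargs -r rm -f --
}
```

Calling `prune_backups` right after the `zip` step would keep the archive count at four, with no special case needed for the first few runs.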

3 Answers:

Answer 0: (score: 1)

Here is a semi-related bash script I recently wrote for a cron job. It looks at a specific directory where automatically generated backups are stored and, if the directory's size exceeds a user-specified limit, deletes the oldest backups. It also keeps a .log file.

#!/bin/bash

# Setting IFS to newline only lets the commands below handle
# file names that contain spaces
IFS=$'\n'

# Backup directory
BACKUPDIR="your directory here"

# Backup directory size limit (in kB)
DIRLIMIT=20971520

# Start logging
LOGFILE="where you want the log file stored"
echo -e "Cron job started at $(date +"%m/%d/%Y %r")\r" >> "$LOGFILE"
echo -e "Directory size limit: $DIRLIMIT [kB]\r" >> "$LOGFILE"

# If $BACKUPDIR exists, find its size
if [ -d "$BACKUPDIR" ]
then
    DIRSIZE=$(du -sk "$BACKUPDIR" | awk '{print $1}')
    echo -e "Current directory size: $DIRSIZE [kB]\r" >> "$LOGFILE"
else
    echo -e "$BACKUPDIR does not exist!\r" >> "$LOGFILE"
    exit 1
fi

# Check if $BACKUPDIR's size is greater than $DIRLIMIT. If so, delete
# old files. If not, exit.
if [ "$DIRSIZE" -gt "$DIRLIMIT" ]
then
    echo -e "Directory size over limit! Attempting to delete oldest log backups...\r" >> "$LOGFILE"
    LOOPSIZE=$DIRSIZE
    while [ "$LOOPSIZE" -gt "$DIRLIMIT" ]
    do
        # The find command below looks at files (-type f) directly inside $BACKUPDIR
        # only (-maxdepth 1) and prints each one's last-modified time and name followed
        # by a newline (-printf "%T@ %p\n"). sort -n then orders them by time and
        # head -1 selects the file modified furthest in the past. The awk command
        # strips the timestamp (field $1). Finally, xargs -I{} rm -f {} deletes the
        # file even when the full file name contains spaces.
        echo -e "Deleting file: $(find "$BACKUPDIR" -maxdepth 1 -type f -printf "%T@ %f\n" | sort -n | head -1 | awk '{ print substr($0, index($0,$2)) }')\r" >> "$LOGFILE"
        find "$BACKUPDIR" -maxdepth 1 -type f -printf "%T@ %p\n" | sort -n | head -1 | awk '{ print substr($0, index($0,$2)) }' | xargs -I{} rm -f {}
        # Recalculate the $BACKUPDIR size; the loop exits once it is under the limit.
        LOOPSIZE=$(du -sk "$BACKUPDIR" | awk '{print $1}')
        echo -e "Directory size is now: $LOOPSIZE [kB]\r" >> "$LOGFILE"
    done
    echo -e "Operation completed successfully! Final directory size is: $LOOPSIZE [kB]\r" >> "$LOGFILE"
else
    echo -e "Directory size is less than the limit. Exiting.\r" >> "$LOGFILE"
fi

# Finish logging
echo -e "\r" >> "$LOGFILE"
echo -e "Cron job exiting at $(date +"%m/%d/%Y %r")\r" >> "$LOGFILE"
echo -e "----------------------------------------------------------------------\r" >> "$LOGFILE"
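The oldest-file selection at the heart of that loop can be exercised on its own. This is an illustrative rewrite of the same pipeline, with `cut` standing in for the awk timestamp-stripping (my substitution, not the answer's exact code):

```shell
#!/bin/bash
# Print the path of the least recently modified file directly inside the
# directory given as $1 (no recursion). The %T@ timestamp contains no
# spaces, so cutting at the first space leaves the full path intact even
# when the file name itself contains spaces.
# Note: GNU find wants -maxdepth before tests such as -type.
oldest_file() {
    find "$1" -maxdepth 1 -type f -printf "%T@ %p\n" |
        sort -n | head -n 1 | cut -d' ' -f2-
}
```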

To make sure the .log file doesn't grow annoyingly large, I've also made a cron script that trims the oldest entries:

#!/bin/bash

# Setting IFS to newline only lets the loop below handle
# file names that contain spaces
IFS=$'\n'

# .log file size limit (in kB)
LOGFILELIMIT=35

# .log files directory
LOGFILESDIR="your directory here"

# Find .log files in $LOGFILESDIR and remove the earliest logs while
# the file size is greater than $LOGFILELIMIT.
for FILE in "$LOGFILESDIR"/*.log
do
    [ -e "$FILE" ] || continue   # skip the literal pattern when no .log files exist
    LOGSIZE=$(du -sk "$FILE" | awk '{print $1}')
    while [ "$LOGSIZE" -gt "$LOGFILELIMIT" ]
    do
        # The sed command deletes rows from the top of the file
        # until it encounters the run of "-" characters that
        # separates the entries in the log file
        sed -i '1,/----------/d' "$FILE"
        LOGSIZE=$(du -sk "$FILE" | awk '{print $1}')
    done
done
exit 0
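The sed expression can be checked in isolation. This demo of mine (with a hypothetical two-entry log layout matching the separator the cron script writes) drops everything from the top of the file through the first separator line, i.e. the oldest entry:

```shell
#!/bin/bash
# Drop the oldest entry of a log file: delete from line 1 through the
# first line matching "----------" (the separator written above).
trim_oldest_entry() {
    sed -i '1,/----------/d' "$1"
}
```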

Answer 1: (score: 0)

Hi, I wrote something similar a long time ago. It targets the backup from 7 days ago and deletes it from AWS S3. Hope this works for you.

#!/bin/bash
# Name of the backup taken 7 days ago, e.g. 2013-03-27.sql.gz
NOWDATE=$(date -I -d '7 days ago')
BACKUPNAME="$NOWDATE.sql.gz"
/usr/local/bin/s3cmd del "s3://path/to/bucket/$BACKUPNAME"
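A note on the date arithmetic: `-d '7 days ago'` and `-I` are GNU `date` features (on BSD/macOS the rough equivalent is `date -v-7d +%F`). A quick sanity check of the name this produces:

```shell
#!/bin/bash
# GNU date: -I prints ISO-8601 (YYYY-MM-DD), -d accepts a relative
# expression. The backup name is just that date plus a fixed suffix.
NOWDATE=$(date -I -d '7 days ago')
BACKUPNAME="$NOWDATE.sql.gz"
echo "$BACKUPNAME"
```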

Answer 2: (score: 0)

rm -f `ls -t ????y-??m-??d_??h-??m.zip|sed 1,4d`

rm -f `ls -r ????y-??m-??d_??h-??m.zip|sed 1,4d`

Either of these deletes all but the four newest zip files (by modification time and by file name, respectively).
