I'm trying to write a Python script that archives and compresses one-year-old data into date-wise tar files. The script also generates a log file of the archived files. I'm using Python 2.6 on Linux.
Here is my code:
for search_date in dd_list:
    tar_file = "/files/yearly_archive/nas_archive_" + search_date + ".tgz"
    mytar = tarfile.open(tar_file, "w:gz")
    log_file = "/files/yearly_archive/archive_log_" + search_date
    fcount = 0
    #print tar_file
    #print log_file
    f = open(log_file, 'ab+')
    for f_name, d_date in date_file_dict.iteritems():
        if d_date == search_date:
            #print f_name
            fcount += 1
            mytar.add(f_name)
            f.write(f_name + '\n')
    date_occur_dict[search_date] = fcount
    mytar.close()
    f.close()
The log file is appended to if it already exists, but the tar file is overwritten every time the script runs. Is there a way to make sure the tar file is appended to if it exists, and created otherwise?
EDIT:
I tried adding code to decompress the archive and then add to it, but it isn't working.
for search_date in dd_list:
    tar_file = "/files/yearly_archive/nas_archive_" + search_date + ".tgz"
    zip = 1
    try:
        with open(tar_file, 'ab+'):
            import gzip
            d_tar = gzip.open(tar_file, 'wb')
            zip = 0
    except IOError:
        print "Creating new tar file"
    if zip == 1:
        mytar = tarfile.open(tar_file, "w:gz")
    else:
        mytar = tarfile.open(d_tar, "w")
    log_file = "/files/yearly_archive/archive_log_" + search_date
    fcount = 0
    #print tar_file
    #print log_file
    f = open(log_file, 'ab+')
    for f_name, d_date in date_file_dict.iteritems():
        if d_date == search_date:
            #print f_name
            fcount += 1
            mytar.add(f_name)
            f.write(f_name + '\n')
    date_occur_dict[search_date] = fcount
    mytar.close()
    f.close()
I'm getting the following error:
Traceback (most recent call last):
File "sort_archive.py", line 63, in <module>
mytar = tarfile.open(d_tar,"w")
File "/usr/lib64/python2.6/tarfile.py", line 1687, in open
return cls.taropen(name, mode, fileobj, **kwargs)
File "/usr/lib64/python2.6/tarfile.py", line 1697, in taropen
return cls(name, mode, fileobj, **kwargs)
File "/usr/lib64/python2.6/tarfile.py", line 1518, in __init__
fileobj = bltn_open(name, self._mode)
TypeError: coercing to Unicode: need string or buffer, instance found
Answer 0 (score: 1)
You cannot append to a compressed tarball with tarfile. Either do the decompress/compress steps separately, or skip the compression in the first place.
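The "decompress, append, recompress" route can be sketched as follows. `append_to_tgz` is a hypothetical helper, not from the question; the sketch targets modern Python (on 2.6 you would wrap `gzip.open` in `contextlib.closing`, since `GzipFile` only became a context manager in 2.7).

```python
import gzip
import os
import shutil
import tarfile

def append_to_tgz(tgz_path, members):
    """Append files to a .tgz by round-tripping through a plain tar,
    which tarfile *can* open in append mode ("a")."""
    tar_path = tgz_path[:-4] + ".tar"  # assumes a ".tgz" suffix
    if os.path.exists(tgz_path):
        # Decompress the existing archive to an uncompressed tar.
        with gzip.open(tgz_path, "rb") as gz, open(tar_path, "wb") as out:
            shutil.copyfileobj(gz, out)
        mode = "a"  # append to the existing plain tar
    else:
        mode = "w"  # no archive yet, create a fresh one
    tar = tarfile.open(tar_path, mode)
    try:
        for name in members:
            tar.add(name)
    finally:
        tar.close()
    # Recompress back to .tgz and drop the intermediate tar.
    with open(tar_path, "rb") as src, gzip.open(tgz_path, "wb") as gz:
        shutil.copyfileobj(src, gz)
    os.remove(tar_path)
```

Note the asker's second attempt fails for a different reason: `tarfile.open(d_tar, "w")` passes a `GzipFile` object where the first positional argument must be a *name* string (hence the `TypeError: coercing to Unicode`); a file object would have to go in as `fileobj=`.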
Answer 1 (score: 0)
<strikethrough>
Have you tried changing the mode? I see w, which obviously overwrites the file. Try a or w+.
mytar = tarfile.open(tar_file,"w+:gz")
或
mytar = tarfile.open(tar_file,"a:gz")
</strikethrough>
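For the record, this struck-through suggestion cannot work: tarfile rejects append mode on compressed archives outright, raising `ValueError` before it even touches the file. A quick check (the path here is arbitrary, nothing is created):

```python
import tarfile

# tarfile only supports "r" and "w" (and, on newer Pythons, "x")
# in combination with ":gz"; "a:gz" is refused up front.
try:
    tarfile.open("/tmp/does_not_matter.tgz", "a:gz")
except ValueError as e:
    print("rejected: %s" % e)
```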