Our storage array is having SMB connection problems, so for now we are forced to access the files periodically over FTP. Rather than using Bash, I am trying to do this in Python, but I have run into some problems. The script needs to search an FTP directory recursively and find all files matching "*1700_m30.mp4" that are newer than 24 hours, then copy all of those files locally.
This is what I have so far, but I cannot get the script to download the files or to read the file stats that would tell me whether they are newer than 24 hours.
#!/usr/bin/env python
# encoding: utf-8
import sys
import os
import ftplib
import ftputil
import fnmatch
import time

dir_dest = '/Volumes/VoigtKampff/Temp/TEST1/' # directory the files need to be downloaded to
pattern = '*1700_m30.mp4' # filename pattern the script is looking for
print 'Looking for this pattern :', pattern # print pattern

print "logging into GSP"
host = ftputil.FTPHost('xxx.xxx','xxx','xxxxx') # FTP host info
recursive = host.walk("/GSPstor/xxxxx/xxx/xxx/xxx/xxxx", topdown=True, onerror=None) # recursive search
for root, dirs, files in recursive:
    for name in files:
        print 'Files :', files # print all files it finds
    video_list = fnmatch.filter(files, pattern)
    print 'Files to be moved :', video_list # print list of files to be moved
    if host.path.isfile(video_list): # check whether the file is valid
        host.download(video_list, video_list, 'b') # download file list
host.close
Here is the revised script based on ottomeister's excellent suggestions (thank you!!). The last remaining problem is that it downloads, but it keeps re-downloading the files and overwriting the existing copies:
import sys
import os
import ftplib
import ftputil
import fnmatch
import time
from time import mktime
import datetime
import os.path, time
from ftplib import FTP

dir_dest = '/Volumes/VoigtKampff/Temp/TEST1/' # directory the files need to be downloaded to
pattern = '*1700_m30.mp4' # filename pattern the script is looking for
print 'Looking for this pattern :', pattern # print pattern

utc_datetime_less24H = datetime.datetime.utcnow() - datetime.timedelta(seconds=86400) # UTC time minus 24 hours
print 'UTC time less than 24 Hours is: ', utc_datetime_less24H.strftime("%Y-%m-%d %H:%M:%S")

print "logging into GSP FTP"
with ftputil.FTPHost('xxxxxxxx','xxxxxx','xxxxxx') as host: # FTP host info
    recursive = host.walk("/GSPstor/xxxx/com/xxxx/xxxx/xxxxxx", topdown=True, onerror=None) # recursive search
    for root, dirs, files in recursive:
        for name in files:
            print 'Files :', files # print all files it finds
        video_list = fnmatch.filter(files, pattern) # collect all files that match the pattern into video_list
        statinfo = host.stat(root, video_list) # get the stats for the files in video_list
        file_mtime = datetime.datetime.utcfromtimestamp(statinfo.st_mtime)
        print 'Files with pattern: %s and epoch mtime is: %s ' % (video_list, statinfo.st_mtime)
        print 'Last Modified: %s' % datetime.datetime.utcfromtimestamp(statinfo.st_mtime)
        if file_mtime >= utc_datetime_less24H:
            for fname in video_list:
                fpath = host.path.join(root, fname)
                if host.path.isfile(fpath):
                    host.download_if_newer(fpath, os.path.join(dir_dest, fname), 'b')
    host.close()
Answer 0 (score: 4)
This line:

video_list = fnmatch.filter(files, pattern)

gets a list of the filenames that match your glob pattern. But this line:

if host.path.isfile(video_list): # check whether the file is valid

is bogus, because host.path.isfile() does not want a list of filenames as its argument; it wants a single pathname. So you need to iterate over video_list, constructing one pathname at a time, passing each pathname to host.path.isfile(), and then possibly downloading that particular file. Something like this:
import os.path
for fname in video_list:
    fpath = host.path.join(root, fname)
    if host.path.isfile(fpath):
        host.download(fpath, os.path.join(dir_dest, fname), 'b')
Notice that this uses host.path.join() to manage remote pathnames and os.path.join() to manage local pathnames. Also notice that it deposits all of the downloaded files into a single directory. If you want to put them into a directory hierarchy that mirrors the remote layout (and you will have to do something like that if the filenames in different remote directories can collide), then you will need to construct a different destination path, and you will probably also have to create the local destination directory hierarchy.
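For that mirrored-layout case, the path mapping could be sketched with a small helper (the name local_target is my own; it assumes remote paths are POSIX-style, which is how ftputil reports them):

```python
import os
import posixpath

def local_target(remote_root, remote_path, dir_dest):
    # Map a remote file path onto the local destination tree, preserving
    # the directory layout below remote_root. Remote paths are treated as
    # POSIX-style; the result uses the local OS path conventions.
    rel = posixpath.relpath(remote_path, remote_root)
    return os.path.join(dir_dest, *rel.split('/'))
```

Before downloading, the parent directory of the returned path would still have to be created, e.g. with os.makedirs() guarded by an os.path.isdir() check.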
To get the timestamp information, use host.lstat() or host.stat(), depending on how you want to handle symlinks.
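The 24-hour cutoff itself can then be reduced to a small, testable helper (the name modified_within is mine; st_mtime comes back from those stat calls as seconds since the epoch, and whether that clock is UTC depends on the server, so treat this as a sketch under that caveat):

```python
import time

def modified_within(mtime, max_age_seconds=86400, now=None):
    # True if a file whose stat reports this mtime (seconds since the
    # epoch) was modified within the last max_age_seconds.
    if now is None:
        now = time.time()
    return (now - mtime) <= max_age_seconds
```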
And yes, that should be host.close(). Without it the connection will be closed after the host variable goes out of scope and is garbage-collected, but it is better to close it explicitly. Better still, use a with clause to ensure that the connection gets closed even if an exception causes this code to be abandoned before it reaches the host.close() call, like this:
with ftputil.FTPHost('xxx.xxx','xxx','xxxxx') as host: # ftp host info
    recursive = host.walk(...)
    ...
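Putting these pieces together, the whole job could be sketched as one function (the name fetch_recent is mine; it assumes the ftputil.FTPHost methods used above — walk(), stat(), path.isfile(), path.join() and download_if_newer() — behave as in the asker's version of the library):

```python
import fnmatch
import os
import time

def fetch_recent(host, remote_root, pattern, dir_dest, max_age=86400, now=None):
    # Walk remote_root recursively, pick out files matching the glob
    # pattern whose mtime falls within the last max_age seconds, and
    # download each one into dir_dest. Returns the remote paths that
    # were handed to download_if_newer().
    if now is None:
        now = time.time()
    fetched = []
    for root, dirs, files in host.walk(remote_root, topdown=True, onerror=None):
        for name in fnmatch.filter(files, pattern):
            fpath = host.path.join(root, name)
            if not host.path.isfile(fpath):
                continue
            if (now - host.stat(fpath).st_mtime) > max_age:
                continue  # older than the cutoff, skip it
            host.download_if_newer(fpath, os.path.join(dir_dest, name), 'b')
            fetched.append(fpath)
    return fetched
```

Because download_if_newer() only re-fetches a file when the remote copy looks newer than the local one, it should also stop the constant overwriting the asker saw, provided the server reports usable timestamps.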