Copying multiple files with the same filename from a server to local with Python paramiko

Date: 2020-04-04 12:16:00

Tags: python ssh sftp paramiko

I want to use paramiko (for a school project) to copy multiple files with the same name from a server to my local machine. However, I would like to have a list of servers that the script goes through, running the same code on each one, and also detecting whether each server is online. How can I do this?

I don't need several identical files. I only need to pull the specific "dblatmonstat" file.

An example of a filename: pc_dblatmonstat_dpc01n1_scl000101014.log

So it should first go through...

dpc01n1.sccloud.xxx.com

then through... dpc02n1.sccloud.xxx.com

...and so on.

Here is what I have so far:

import os
import paramiko
import re

#Create log file
#paramiko.util.log_to_file('/$PMRootDir/SrcFiles/logfetcher.log')
#paramiko.util.load_host_keys(os.path.expanduser('~/.ssh/known_hosts'))

#Credentials
host = 'dpc01n1.sccloud.xxx.com'
port = 22
username = 'pi'
password = 'pi'

#Locations
files = re.search('?<=pc_dblatmonstat_dpc01n1_)\w+', files)
print('There are files:', files)
remote_path = '/home/pi/Desktop/logs'
local_path = r'C:\Users\urale\Desktop\logs'


#Opening ssh and ftp
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(
            paramiko.AutoAddPolicy())
ssh.connect(host, username, port, password)
sftp = ssh.open_sftp()

#Getting files
for file in files:
    file_remote = remote_path + file
    file_local = local_path + file

    print (file_remote) + '>>>' + file_local

    #sftp.get(file_remote, file_local)
    sftp.put(file_local, file_remote)

sftp.close()
ssh.close()

EDIT:

This version keeps downloading the same file over and over. How do I break out of the loop once a file has finished downloading, so the script moves on to the next server? I have also tried to use the re.search function so that only the pc_dblatmonstat_xxxxxxxxxx_xxxxxxxxxxxxx.log file is downloaded. The re.search should match something like dblatmonstat_*_*.log...
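For reference, this is roughly the pattern I think I need, based on the example filename above (just a sketch; the \w+ parts standing in for the host and serial number are my own placeholders):

import re

# Sketch of the pattern I'm after, based on the example filename
# pc_dblatmonstat_dpc01n1_scl000101014.log. The host and serial parts
# are plain \w+ placeholders and may need tightening.
log_pattern = re.compile(r'pc_dblatmonstat_\w+_\w+\.log$')

print(bool(log_pattern.match('pc_dblatmonstat_dpc01n1_scl000101014.log')))  # True
print(bool(log_pattern.match('some_other_file.log')))                       # False

The full edited script is below: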

import os
import paramiko
import re

# You could add the local_path to the function to define individual places for the 
# files that you download.
Lpath = 'C:\\'
Lpath1 = 'Users'
Lpath2 = 'urale'
Lpath3 = 'Desktop'
Lpath4 = 'logs\\'
local_path = os.path.join(Lpath, Lpath1, Lpath2, Lpath3, Lpath4)

Rpath1 = 'home'
Rpath2 = 'pi'
Rpath3 = 'Desktop'
Rpath4 = 'logs'
remote_path = os.path.join(Rpath1, Rpath2, Rpath3, Rpath4)

# 1. Create function
def get_server_files(local_path, host, port, username, password, remote_path, files):
    #Opening ssh and ftp
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, port, username, password)
    sftp = ssh.open_sftp()

    #Getting files
    for file in files:

        file_remote = remote_path + files
        file_local = local_path + files

        print(file_remote, '>>>', file_local)

        sftp.get(file_remote, file_local)
        #sftp.put(file_local, file_remote)

    sftp.close()
    ssh.close()

# 2. list of servers
# Add new dictionary for each server to this list
list_of_servers = [
    { 'host': '192.168.1.64',
      'port': 22, 
      'username': 'pi', 
      'password': 'pi', 
      'remote_path': '/home/pi/Desktop/logs/', 
      'files':  'pc_dblatmonstat_dpc01n1_scl000101014.log'
      }
]

# 3. Iterate through the list_of_servers, using the function above
for server in list_of_servers:
    get_server_files(local_path, **server)

1 Answer:

Answer 0 (score: 1):

I haven't tested the following, but it should work and give you an approach to solving your problem:

  1. Turn the script into a function
  2. Create a list of servers
  3. Iterate through the list with that function to retrieve the files

This is reflected below:

import os
import paramiko
import re

# 1. Create function
def get_server_files(local_path, host, port, username, password, remote_path, file_pattern):
    """Connects to host and searches for files matching file_pattern
    in remote_path. Downloads all matches to 'local_path'"""
    #Opening ssh and ftp
    ssh_con = paramiko.SSHClient()
    ssh_con.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh_con.connect(host, port, username, password)
    sftp_con = ssh_con.open_sftp()

    # Finding files
    all_files_in_path = sftp_con.listdir(path=remote_path)
    r = re.compile(file_pattern)
    files = list(filter(r.match, all_files_in_path))

    #Download files
    for file in files:
        file_remote = remote_path + '/' + file
        file_local = os.path.join(local_path, file)

        print(file_remote + ' >>> ' + file_local)

        sftp_con.get(file_remote, file_local)
        #sftp_con.put(file_local, file_remote)

    sftp_con.close()
    ssh_con.close()

# 2. list of servers
# Add new dictionary for each server to this list
list_of_servers = [
    { 'host': 'dpc01n1.sccloud.xxx.com',
      'port': 22, 
      'username': 'pi', 
      'password': 'pi', 
      'remote_path': '/home/pi/Desktop/logs', 
      'file_pattern': 'pc_dblatmonstat_dpc01n1'}
]

# You could add the local_path to the function to define individual places for the
# files that you download.
local_path = r'C:\Users\urale\Desktop\logs'

# 3. Iterate through the list_of_servers, using the function above
for server in list_of_servers:
    get_server_files(local_path, **server)
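You also asked about detecting whether a server is online. One simple approach (again untested, just a sketch) is to attempt a plain TCP connection to the SSH port with a short timeout before calling the function above, and skip any host that doesn't respond, calling the wrapper from the final loop instead of get_server_files directly. The wrapper name get_server_files_safe and the 5-second timeout are just placeholders you can rename and tune:

import socket

def get_server_files_safe(local_path, timeout=5, **server):
    """Call get_server_files only if the host accepts a TCP connection
    on its SSH port within `timeout` seconds; otherwise skip it."""
    host, port = server['host'], server['port']
    try:
        # Cheap reachability check before the full SSH handshake
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except OSError as err:
        print(host, 'appears to be offline, skipping:', err)
        return

    try:
        get_server_files(local_path, **server)
    except (paramiko.SSHException, OSError) as err:
        print('Transfer from', host, 'failed:', err)

for server in list_of_servers:
    get_server_files_safe(local_path, **server)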