In Ansible, how do I synchronize two folders on the same remote machine?

Asked: 2016-08-05 21:41:19

Tags: ansible rsync ansible-playbook ansible-2.x

I have the following simple task:

Copy everything in folder A into folder B, on the same machine. Since there are many hosts in the group, I use the following YAML task definition:

- name: Sync /etc/spark/conf to $SPARK_HOME/conf
  synchronize: src=/etc/spark/conf dest={{spark_home}}/conf
  delegate_to: "{{item}}"
  with_items: "{{play_hosts}}"
  tags: spark

However, running ansible-playbook gives me the following errors:

TASK [cloudera : Sync /etc/spark/conf to $SPARK_HOME/conf] *********************
failed: [52.53.220.119 -> 52.53.200.0] (item=52.53.200.0) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.220.119:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.200.0", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}
failed: [52.53.200.193 -> 52.53.200.0] (item=52.53.200.0) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.200.193:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.200.0", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}
ok: [52.53.200.0 -> 52.53.200.0] => (item=52.53.200.0)
ok: [52.53.220.119 -> 52.53.220.119] => (item=52.53.220.119)
failed: [52.53.200.193 -> 52.53.220.119] (item=52.53.220.119) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.200.193:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.220.119", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}
failed: [52.53.200.0 -> 52.53.220.119] (item=52.53.220.119) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.200.0:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.220.119", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}
ok: [52.53.200.193 -> 52.53.200.193] => (item=52.53.200.193)
failed: [52.53.220.119 -> 52.53.200.193] (item=52.53.200.193) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.220.119:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.200.193", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: error in rsync protocol data stream (code 12) at io.c(226) [sender=3.1.0]\n", "rc": 12}
failed: [52.53.200.0 -> 52.53.200.193] (item=52.53.200.193) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.200.0:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.200.193", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}

Apparently Ansible tried to build every ordered pair of my 3 hosts and sync between each pair (so 9 rsyncs were executed). How can I avoid this and tell Ansible to run rsync only locally on each host?
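The arithmetic behind those 9 runs: `with_items: "{{play_hosts}}"` makes every host in the play execute the task once per listed host, so the delegation produces the full cross product of (executing host, delegate host). A quick sketch using the three host IPs from the log above:

```shell
# 3 executing hosts x 3 delegate items = 9 rsync invocations.
hosts="52.53.220.119 52.53.200.193 52.53.200.0"
n=0
for h in $hosts; do
  for d in $hosts; do
    n=$((n+1))    # one synchronize run per (executing host, delegate) pair
  done
done
echo "$n"  # prints 9
```

Only the 3 diagonal pairs (host delegating to itself) succeed in the log; the 6 off-diagonal pairs fail because the control machine's private key does not exist on the remote senders.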

UPDATE: I have changed the task definition to use delegate.host:

- name: Sync /etc/spark/conf to $SPARK_HOME/conf
  synchronize: src=/etc/spark/conf dest={{spark_home}}/conf
  delegate_to: delegate.host
  tags: spark

But the Ansible engine apparently does not interpret it correctly; the debug log shows that it was not substituted with a host IP address:

  

ESTABLISH SSH CONNECTION FOR USER:

SSH: EXEC ssh -C -q -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/peng/.ansible/cp/ansible-ssh-%h-%p-%r delegate.host '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1470667606.38-157157938048153 `" && echo ansible-tmp-1470667606.38-157157938048153="` echo $HOME/.ansible/tmp/ansible-tmp-1470667606.38-157157938048153 `" ) && sleep 0'"'"''

This looks like a deprecated feature. I am using Ansible 2.1.0.0.

1 Answer:

Answer 0 (score: 2)

Solved:

- name: Sync /etc/spark/conf to $SPARK_HOME/conf
  synchronize: src=/etc/spark/conf dest={{spark_home}}/conf
  delegate_to: "{{ inventory_hostname }}"
  tags: spark

It looks like delegate.host may have been removed in favour of a new variable.