Hadoop sshfence (Permission denied)

Date: 2014-01-21 08:54:45

Tags: hadoop hdfs high-availability

I am trying out Hadoop 2 high availability for HDFS. I set up passwordless SSH between the NameNodes under the user hafence and verified that it works (a sketch of that setup appears after the log below). However, when sshfence runs with this setup, I get the following "Permission denied" failure:

2014-01-20 12:54:47,101 INFO  ha.NodeFencer (NodeFencer.java:fence(91)) - ====== Beginning Service Fencing Process... ======
2014-01-20 12:54:47,101 INFO  ha.NodeFencer (NodeFencer.java:fence(94)) - Trying method 1/1: org.apache.hadoop.ha.SshFenceByTcpPort(hafence:22)
2014-01-20 12:54:47,101 WARN  ha.SshFenceByTcpPort (SshFenceByTcpPort.java:tryFence(93)) - Unable to create SSH session
com.jcraft.jsch.JSchException: java.io.FileNotFoundException: /home/hafence/.ssh/id_rsa (Permission denied)
    at com.jcraft.jsch.IdentityFile.newInstance(IdentityFile.java:98)
    at com.jcraft.jsch.JSch.addIdentity(JSch.java:206)
    at com.jcraft.jsch.JSch.addIdentity(JSch.java:192)
    at org.apache.hadoop.ha.SshFenceByTcpPort.createSession(SshFenceByTcpPort.java:122)
    at org.apache.hadoop.ha.SshFenceByTcpPort.tryFence(SshFenceByTcpPort.java:91)
    at org.apache.hadoop.ha.NodeFencer.fence(NodeFencer.java:97)
    at org.apache.hadoop.ha.ZKFailoverController.doFence(ZKFailoverController.java:521)
    at org.apache.hadoop.ha.ZKFailoverController.fenceOldActive(ZKFailoverController.java:494)
    at org.apache.hadoop.ha.ZKFailoverController.access$1100(ZKFailoverController.java:59)
    at org.apache.hadoop.ha.ZKFailoverController$ElectorCallbacks.fenceOldActive(ZKFailoverController.java:837)
    at org.apache.hadoop.ha.ActiveStandbyElector.fenceOldActive(ActiveStandbyElector.java:900)
    at org.apache.hadoop.ha.ActiveStandbyElector.becomeActive(ActiveStandbyElector.java:799)
    at org.apache.hadoop.ha.ActiveStandbyElector.processResult(ActiveStandbyElector.java:415)
    at org.apache.zookeeper.ClientCnxn$EventThread.processEvent(ClientCnxn.java:596)
    at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:495)
Caused by: java.io.FileNotFoundException: /home/hafence/.ssh/id_rsa (Permission denied)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
    at java.io.FileInputStream.<init>(FileInputStream.java:97)
    at com.jcraft.jsch.IdentityFile.newInstance(IdentityFile.java:83)
    ... 14 more
2014-01-20 12:54:47,102 WARN  ha.NodeFencer (NodeFencer.java:fence(108)) - Fencing method org.apache.hadoop.ha.SshFenceByTcpPort(hafence:22) was unsuccessful.
2014-01-20 12:54:47,102 ERROR ha.NodeFencer (NodeFencer.java:fence(111)) - Unable to fence service by any configured method.
2014-01-20 12:54:47,102 WARN  ha.ActiveStandbyElector (ActiveStandbyElector.java:becomeActive(807)) - Exception handling the winning of election
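
For context, the passwordless SSH between the NameNodes for the hafence user was set up and verified roughly like this (a minimal sketch of the standard approach; other-namenode-host is a placeholder, and the exact commands on my machines may have differed):

    # run as the hafence user on each NameNode
    ssh-keygen -t rsa -f ~/.ssh/id_rsa -N ""      # generate a key pair with no passphrase
    ssh-copy-id hafence@other-namenode-host       # install the public key on the peer NameNode
    ssh hafence@other-namenode-host hostname      # confirm the login works without a password prompt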

My configuration in hdfs-site.xml is as follows:

  <property>
    <name>dfs.ha.fencing.methods</name>
    <value>sshfence(hafence:22)</value>
  </property>
  <property>
    <name>dfs.ha.fencing.ssh.private-key-files</name>
    <value>/home/hafence/.ssh/id_rsa</value>
  </property>
  <property>
    <name>dfs.ha.fencing.ssh.connect-timeout</name>
    <value>30000</value>
  </property>

  <property>
    <name>dfs.ha.automatic-failover.enabled</name>
    <value>true</value>
  </property>

What are the permission requirements, user setup, etc. needed for sshfence to work correctly?

2 answers:

Answer 0 (score: 1)

Since Hadoop was installed from RPM packages, we switched to using the hdfs user (with passwordless SSH auth) for sshfence, and sshfence then started working.
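
A minimal sketch of what the fencing entries might look like when switching to the hdfs user (the key path below assumes an RPM layout where the hdfs user's home directory is /var/lib/hadoop-hdfs; adjust it to whatever your installation actually uses):

  <property>
    <name>dfs.ha.fencing.methods</name>
    <value>sshfence(hdfs:22)</value>
  </property>
  <property>
    <name>dfs.ha.fencing.ssh.private-key-files</name>
    <value>/var/lib/hadoop-hdfs/.ssh/id_rsa</value>
  </property>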

Answer 1 (score: 0)

Check the permissions on /home/hafence/.ssh/id_rsa. I had this same problem. Your user should have read and write permission on id_rsa.
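
For example, a sketch of the usual way to check and tighten those permissions (keep in mind the key is read by whichever user the ZKFC/NameNode process runs as, so that user must be able to read the file and traverse the directories above it):

    ls -l /home/hafence/.ssh/id_rsa              # inspect the current owner and mode
    chown hafence /home/hafence/.ssh/id_rsa      # key should be owned by the fencing user
    chmod 700 /home/hafence/.ssh                 # conventional restrictive mode for the .ssh directory
    chmod 600 /home/hafence/.ssh/id_rsa          # private key readable and writable only by its owner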