Hadoop | local host and destination host are different

Asked: 2019-10-20 15:20:36

Tags: hadoop ssh localhost

I am trying to set up Hadoop on my local machine, and I am quite confused by this:

▶ hadoop fs -mkdir /home/hadoop
mkdir: Failed on local exception: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length; Host Details : local host is: "aleph-pc.local/192.168.1.129"; destination host is: "aleph-pc":8020;

I think this has to do with how I configured sshd, or with the value of fs.default.name, and it seems closely related to this other question. Where can I go from here? Any help would be appreciated.

▶ cat /etc/hosts
127.0.0.1 localhost
::1 localhost
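
The error message names `"aleph-pc":8020` as the destination host, but the hosts file above only maps `localhost`. A minimal sketch of an addition that would make `aleph-pc` resolve to the local machine (this is an assumption about the intended setup, not a confirmed fix):

```text
# /etc/hosts — hypothetical addition so "aleph-pc" resolves locally
127.0.0.1 localhost aleph-pc
::1 localhost
```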

▶ hadoop version
Hadoop 2.9.2
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 826afbeae31ca687bc2f8471dc841b66ed2c6704
Compiled by ajisaka on 2018-11-13T12:42Z
Compiled with protoc 2.5.0
From source with checksum 3a9939967262218aa556c684d107985
This command was run using /home/aleph/Documents/Projects/Hadoop/hadoop-2.9.2/share/hadoop/common/hadoop-common-2.9.2.jar

▶ tail hadoop-2.9.2/etc/hadoop/core-site.xml 
-->

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>                                                                                    
        <name>fs.default.name</name>                                                              
        <value>hdfs://aleph-pc/127.0.0.1</value>
    </property>      


</configuration>
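
The value `hdfs://aleph-pc/127.0.0.1` above is not a valid filesystem URI: Hadoop parses `aleph-pc` as the authority (falling back to the default RPC port 8020) and treats `/127.0.0.1` as a path component. A minimal sketch of a corrected `core-site.xml`, assuming a single-node setup; note that `fs.default.name` is deprecated in favor of `fs.defaultFS`, and port 9000 is the conventional tutorial choice rather than anything mandated:

```xml
<configuration>
    <property>
        <!-- fs.defaultFS replaces the deprecated fs.default.name;
             the value must be a bare hdfs://host:port URI with no
             trailing path component -->
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
```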

▶ jps
17619 Main
32133 NodeManager
32037 ResourceManager
31879 SecondaryNameNode
17816 RemoteMavenServer36
1020 Jps
31678 DataNode
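
Notably, the jps listing shows a DataNode, SecondaryNameNode, ResourceManager, and NodeManager, but no NameNode process, so nothing is serving HDFS RPC on the address the client is dialing. A rough sketch of getting one running, assuming a standard pseudo-distributed 2.9.x install with `$HADOOP_HOME/sbin` on the PATH (be aware that formatting erases any existing HDFS metadata, so it is only appropriate on a fresh setup):

```text
hdfs namenode -format   # first-time setup only; wipes HDFS metadata
start-dfs.sh
jps                     # a NameNode entry should now appear
```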

0 Answers