Cannot create a directory in Hadoop using dfs

Date: 2014-10-05 23:20:44

Tags: java android apache ubuntu hadoop

I am trying to create a directory with the following command:

manoj@ubuntu:/usr/local/hadoop/bin$ hadoop dfs -mkdir /tmp

However, I get the following error:

mkdir: unknown host: hadoop

I have posted the log file below and would appreciate some help. This is a single-node Hadoop installation. It looks like a Java UnknownHostException. Please let me know what I need to do to fix this.

manoj@ubuntu:/usr/local/hadoop/logs$ cat hadoop-manoj-datanode-ubuntu.log
2014-10-05 13:08:30,621 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = ubuntu/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.2.0
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1479473; compiled by 'hortonfo' on Mon May  6 06:59:37 UTC 2013
STARTUP_MSG:   java = 1.7.0_65
************************************************************/
2014-10-05 13:08:32,449 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2014-10-05 13:08:32,514 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
2014-10-05 13:08:32,519 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2014-10-05 13:08:32,519 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2014-10-05 13:08:34,173 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi registered.
2014-10-05 13:08:34,191 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already exists!
2014-10-05 13:08:36,439 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.net.UnknownHostException: unknown host: hadoop
    at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:233)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1233)
    at org.apache.hadoop.ipc.Client.call(Client.java:1087)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at com.sun.proxy.$Proxy5.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:414)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:392)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:374)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:453)
    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:335)
    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:300)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:383)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:319)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1698)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1637)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1655)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1781)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1798)

2014-10-05 13:08:36,443 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at ubuntu/127.0.1.1
************************************************************/

2 Answers:

Answer 0 (score: 0)

This looks like a configuration problem. I am assuming you are using a recent version of Hadoop; if so, you should use the hdfs command instead. Try bin/hdfs dfs -ls to see whether your filesystem commands work at all. My guess is that they won't, in which case you should check your core-site.xml for the HDFS setting (fs.defaultFS).
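As a rough sketch of what that setting should look like (the hostname and port below are example values, not taken from the question), core-site.xml must point the default filesystem at a hostname the machine can actually resolve. Note that on Hadoop 1.x, which your log shows (version 1.2.0), the property is named fs.default.name rather than fs.defaultFS:

<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- "localhost" is an example value; the error "unknown host: hadoop"
         suggests your current value references the unresolvable hostname "hadoop" -->
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>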

Answer 1 (score: 0)

Go to $HADOOP_HOME and try bin/hadoop fs -mkdir /tmp
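Given that the DataNode log shows java.net.UnknownHostException: unknown host: hadoop, the root cause is likely that the hostname referenced in the configuration does not resolve. A minimal check and a possible fix, assuming the configured hostname is "hadoop":

# Check whether the hostname resolves:
getent hosts hadoop

# If it does not, for a single-node setup you can map it to the
# loopback address by adding an entry like this to /etc/hosts:
# 127.0.0.1   hadoop

After fixing resolution (or correcting the hostname in core-site.xml), restart the Hadoop daemons and retry the mkdir command.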