Unable to save input file in Hadoop

Date: 2013-07-27 08:23:01

Tags: file hadoop

I am trying to save a sample.txt file to the Hadoop file system using
./softwares/hadoop/bin/hadoop dfs -put sample.txt /input

It does not show any errors, but when I look inside the /input folder, sample.txt is not there. I can only see /input itself:

./softwares/hadoop-1.2.0/bin/hadoop dfs -ls /input
Found 1 items
-rw-r--r--   1 admin supergroup         11 2013-07-27 13:32 /input
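The `ls` output above actually reveals the cause: the entry `/input` has a regular-file mode string (`-rw-r--r--`, leading `-` rather than `d`) and is only 11 bytes, i.e. it is the uploaded sample.txt itself. Because no directory named `/input` existed when `-put` ran, HDFS wrote the file's contents to the path `/input`. A quick way to confirm this (a sketch, assuming the Hadoop 1.x CLI is on the PATH; these commands need a running cluster):

```shell
# Exit status 0 means /input is a directory; non-zero means it is not.
# On Hadoop 1.x, `hadoop dfs` and `hadoop fs` are interchangeable.
hadoop dfs -test -d /input
echo $?

# If /input is really the uploaded file, this prints the text of sample.txt.
hadoop dfs -cat /input
```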

In the log file:

2013-07-27 13:27:34,779 INFO org.apache.hadoop.http.HttpServer: Port returned by webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening the listener on 50060
2013-07-27 13:27:34,780 INFO org.apache.hadoop.http.HttpServer: listener.getLocalPort() 
returned 50060 webServer.getConnectors()[0].getLocalPort() returned 50060
2013-07-27 13:27:34,780 INFO org.apache.hadoop.http.HttpServer: Jetty bound to port 50060
2013-07-27 13:27:34,780 INFO org.mortbay.log: jetty-6.1.26
2013-07-27 13:27:35,174 INFO org.mortbay.log: Started SelectChannelConnector@0.0.0.0:50060
2013-07-27 13:27:35,175 INFO org.apache.hadoop.mapred.TaskTracker: FILE_CACHE_SIZE for 
mapOutputServlet set to : 2000
2013-07-27 13:27:35,485 INFO org.apache.hadoop.mapred.TaskTracker: Failed to get system directory...
2013-07-27 13:27:37,348 INFO org.apache.hadoop.mapred.TaskTracker: Failed to get system directory...
2013-07-27 13:27:41,050 INFO org.apache.hadoop.mapred.TaskTracker: Failed to get system directory...
2013-07-27 13:27:44,319 INFO org.apache.hadoop.mapred.TaskTracker: Failed to get system directory...
2013-07-27 13:27:45,951 INFO org.apache.hadoop.mapred.TaskTracker: Failed to get system directory...
2013-07-27 13:27:48,311 INFO org.apache.hadoop.mapred.TaskTracker: Failed to get system directory...
2013-07-27 13:27:52,346 INFO org.apache.hadoop.mapred.TaskTracker: Failed to get system directory...
2013-07-27 13:27:56,363 INFO org.apache.hadoop.mapred.TaskTracker: Failed to get system directory...
2013-07-27 13:27:57,619 INFO org.apache.hadoop.mapred.TaskTracker: Failed to get system directory...

Can anyone help me solve this problem?

1 answer:

Answer 0 (score: 2)

Please provide the full location of the sample.txt file. If it is in your current home directory, use the file like this:

[hostname]$ hadoop dfs -put ./sample.txt /user/input/
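If the goal really is an HDFS directory named /input, one possible fix (a sketch, assuming Hadoop 1.x and that the stray 11-byte /input file can be discarded) is to remove that file, create the directory explicitly, and upload again:

```shell
# Remove the file that was written to the path /input by mistake.
hadoop dfs -rm /input

# Create the target directory explicitly, then upload into it.
# The trailing slash makes the directory intent unambiguous.
hadoop dfs -mkdir /input
hadoop dfs -put ./sample.txt /input/

# Verify: sample.txt should now be listed inside the /input directory.
hadoop dfs -ls /input
```

Creating the destination directory before the first `-put` avoids this class of problem entirely: when the destination does not exist, `-put` treats it as the target file name rather than a containing directory.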