I installed Hadoop 2.7.1 on Ubuntu 15.04. I want to copy files from the Hadoop distribution into an input folder I created, using these commands:
$ mkdir input
(creates the input directory inside hadoop_dev)
$ cp etc/hadoop/*.xml input
(copies all the xml files into the input folder)
But it fails with the error: cp: target ‘input’ is not a directory
Thanks.
Answer 0 (score: 0)
If you are trying to copy the configuration files from the local filesystem into HDFS, try the following:
1. Create a directory in HDFS:
hdfs dfs -mkdir /input
2. Copy the files into HDFS:
hdfs dfs -put /etc/hadoop/*.xml /input/
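Note that the cp error in the question is a plain-shell issue, independent of Hadoop: when cp is given more than one source file, its final argument must be an existing directory. A minimal local reproduction (the temporary directory and xml file names here are illustrative, not the asker's actual paths):

```shell
# Work in a throwaway directory with two dummy xml files.
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p etc/hadoop
touch etc/hadoop/core-site.xml etc/hadoop/hdfs-site.xml

# `input` does not exist yet: with two source files, cp requires an
# existing directory as the target, so this copy fails.
cp etc/hadoop/*.xml input 2>/dev/null && copied=yes || copied=no
echo "$copied"    # → no

# Create the directory first, then the same command succeeds.
mkdir input
cp etc/hadoop/*.xml input && echo "copied $(ls input | wc -l) files"
```

So the likely cause of the original error is running mkdir and cp from different working directories, so that input did not exist where cp was invoked.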
Update 1:
Export the hadoop commands (including hdfs) in /home/hadoopuser/.bashrc:
export HADOOP_HOME=/path/to/hadoop/folder
export PATH=$PATH:$HADOOP_HOME/bin
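The exports above take effect in new shells (or after `source ~/.bashrc`). A minimal sketch of what they do, using a hypothetical install path /opt/hadoop in place of the asker's actual folder:

```shell
# Hypothetical install location; substitute your real Hadoop folder.
export HADOOP_HOME=/opt/hadoop
# Append its bin directory so `hadoop` and `hdfs` resolve on the command line.
export PATH=$PATH:$HADOOP_HOME/bin

# The last PATH component is now the Hadoop bin directory.
echo "$PATH" | tr ':' '\n' | tail -n 1    # → /opt/hadoop/bin
```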
Answer 1 (score: 0)
The Hadoop FileSystem (FS) shell includes various shell-like commands that interact directly with the Hadoop Distributed File System (HDFS) as well as the other filesystems Hadoop supports, such as the local FS, HFTP FS, S3 FS (Amazon), Azure Blob storage (Microsoft Azure), and others.
Refer to the Hadoop file system shell command guide for more information; below I include details of the hadoop fs commands that cover your requirement, and you can refer to the guide for the rest.
copyFromLocal
Usage: hadoop fs -copyFromLocal <localsrc> URI
Similar to put command, except that the source is restricted to a local file reference.
Options:
The -f option will overwrite the destination if it already exists.
mkdir
Usage: hadoop fs -mkdir [-p] <paths>
Takes path URIs as arguments and creates directories.
Options:
The -p option behavior is much like Unix mkdir -p, creating parent directories along the path.
Example:
hadoop fs -mkdir /user/hadoop/dir1 /user/hadoop/dir2
hadoop fs -mkdir hdfs://nn1.example.com/user/hadoop/dir hdfs://nn2.example.com/user/hadoop/dir
Exit Code:
Returns 0 on success and -1 on error.
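As the documentation notes, the -p flag behaves much like Unix mkdir -p. A small local-shell sketch of that behavior (hadoop itself is not invoked here, and the paths are illustrative):

```shell
# Throwaway base directory standing in for an HDFS root.
tmp=$(mktemp -d)

# -p creates every missing parent directory along the path...
mkdir -p "$tmp/user/hadoop/dir1"

# ...and does not fail if the directory already exists.
mkdir -p "$tmp/user/hadoop/dir1" && echo "ok"    # → ok
```

`hadoop fs -mkdir -p` offers the same convenience for HDFS paths, which is handy when /user/&lt;name&gt; has not been created yet.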