Copying a file from HDFS to the local machine

Date: 2012-07-13 13:09:44

Tags: java hadoop hdfs

I'm having trouble "downloading" a file from the HDFS file system to my local system (the reverse operation works without any problem). Note: the file exists on the HDFS file system at the specified path.

Here is the code snippet:

    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "${NAMENODE_URI}");
    FileSystem hdfsFileSystem = FileSystem.get(conf);

    String result = "";

    Path local = new Path("${SOME_LOCAL_PATH}");
    Path hdfs = new Path("${SOME_HDFS_PATH}");

    String fileName = hdfs.getName();

    if (hdfsFileSystem.exists(hdfs))
    {
        hdfsFileSystem.copyToLocalFile(hdfs, local);
        result = "File " + fileName + " copied to local machine on location: " + localPath;
    }
    else
    {
        result = "File " + fileName + " does not exist on HDFS on location: " + localPath;
    }

    return result;

The exception I get is the following:

12/07/13 14:57:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.io.IOException: Cannot run program "cygpath": CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessBuilder.start(Unknown Source)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:206)
    at org.apache.hadoop.util.Shell.run(Shell.java:188)
    at org.apache.hadoop.fs.FileUtil$CygPathCommand.<init>(FileUtil.java:412)
    at org.apache.hadoop.fs.FileUtil.makeShellPath(FileUtil.java:438)
    at org.apache.hadoop.fs.FileUtil.makeShellPath(FileUtil.java:465)
    at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:573)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:565)
    at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:403)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:452)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:420)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:774)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:755)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:654)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:259)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:232)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:183)
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1837)
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1806)
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1782)
    at com.hmeter.hadoop.hdfs.hdfsoperations.HdfsOperations.fileCopyFromHdfsToLocal(HdfsOperations.java:75)
    at com.hmeter.hadoop.hdfs.hdfsoperations.HdfsOperations.main(HdfsOperations.java:148)
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessImpl.create(Native Method)
    at java.lang.ProcessImpl.<init>(Unknown Source)
    at java.lang.ProcessImpl.start(Unknown Source)
    ... 22 more

Any idea what the problem might be? Why is it asking for Cygwin's cygpath? I'm running this code on Windows 7.

Thanks

2 Answers:

Answer 0 (score: 10):

Try using this method from the API:

    // delSrc – whether to delete the source after copying; src and dst are the paths you
    // already have; useRawLocalFileSystem should be set to true in your case
    hdfsFileSystem.copyToLocalFile(delSrc, src, dst, useRawLocalFileSystem);

In your case, replace:

    hdfsFileSystem.copyToLocalFile(hdfs, local);

with:

    hdfsFileSystem.copyToLocalFile(false, hdfs, local, true);
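
For reference, a minimal sketch of the questioner's snippet with that change applied (the namenode URI and paths are placeholders carried over from the question, and the surrounding method is omitted):

    // delSrc = false keeps the source file on HDFS; useRawLocalFileSystem = true writes
    // through the raw local file system, which should sidestep the ChecksumFileSystem
    // setPermission/cygpath shell call shown in the stack trace.
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "${NAMENODE_URI}");
    FileSystem hdfsFileSystem = FileSystem.get(conf);

    Path local = new Path("${SOME_LOCAL_PATH}");
    Path hdfs = new Path("${SOME_HDFS_PATH}");

    if (hdfsFileSystem.exists(hdfs))
    {
        hdfsFileSystem.copyToLocalFile(false, hdfs, local, true);
    }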

Answer 1 (score: 6):

You can do it with the following code:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public static void main(String args[]) {
        try {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://localhost:54310/user/hadoop/");
            FileSystem fs = FileSystem.get(conf);
            // List everything under the HDFS directory and copy each entry to the local directory
            FileStatus[] status = fs.listStatus(new Path("hdfsdirectory"));
            for (int i = 0; i < status.length; i++) {
                System.out.println(status[i].getPath());
                fs.copyToLocalFile(false, status[i].getPath(), new Path("localdir"));
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
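
If the cygpath error from the question still appears on Windows with this three-argument overload, the copy call inside the loop can also pass the useRawLocalFileSystem flag from the previous answer (a suggestion combining the two answers, not part of the original one):

    // Same copy as in the loop above, but written via the raw local file system
    fs.copyToLocalFile(false, status[i].getPath(), new Path("localdir"), true);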