Unable to create files with Pail DFS

Asked: 2015-06-13 22:38:52

Tags: hdfs lambda-architecture

Newbie here. I am trying to run the Pail DFS datastore code from Nathan Marz's Big Data book, connecting to an HDFS VM. What am I doing wrong? I also tried replacing hdfs with file. Any help is appreciated.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Before;
import org.junit.Test;

import backtype.hadoop.pail.Pail;
import backtype.hadoop.pail.Pail.TypedRecordOutputStream;

public class AppTest {
    private App app = new App();
    private String path = "hdfs:////192.168.0.101:8080/mypail";

    @Before
    public void init() throws IllegalArgumentException, IOException {
        FileSystem fs = FileSystem.get(new Configuration());
        fs.delete(new Path(path), true);
    }

    @Test
    public void testAppAccess() throws IOException {
        Pail pail = Pail.create(path);
        TypedRecordOutputStream os = pail.openWrite();
        os.writeObject(new byte[] {1, 2, 3});
        os.writeObject(new byte[] {1, 2, 3, 4});
        os.writeObject(new byte[] {1, 2, 3, 4, 5});
        os.close();
    }
}

I get the error -

java.lang.IllegalArgumentException: Wrong FS: hdfs:/192.168.0.101:8080/mypail, expected: file:///
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
    at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:80)
    at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:529)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:747)

Replacing hdfs with file as file:/// gives:

java.io.IOException: Mkdirs failed to create file:/192.168.0.101:8080/mypail (exists=false, cwd=file:/Users/joshi/git/projectcsr/projectcsr)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:442)
    at 
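As an aside, separate from the "Wrong FS" error itself: the path in the question has four slashes after `hdfs:`, so a URI parser sees an empty authority and pushes the host and port into the path component. A minimal sketch with `java.net.URI` (the IP and port are just the ones from the question) shows the difference:

```java
import java.net.URI;

public class PailPathCheck {
    public static void main(String[] args) {
        // Four slashes: the authority is empty, so no host is parsed at all
        URI bad = URI.create("hdfs:////192.168.0.101:8080/mypail");
        System.out.println(bad.getHost()); // null
        System.out.println(bad.getPath()); // //192.168.0.101:8080/mypail

        // Two slashes: host and port parse as intended
        URI good = URI.create("hdfs://192.168.0.101:8080/mypail");
        System.out.println(good.getHost()); // 192.168.0.101
        System.out.println(good.getPort()); // 8080
        System.out.println(good.getPath()); // /mypail
    }
}
```

So even once the default filesystem is configured correctly, the path should be written with two slashes: `hdfs://192.168.0.101:8080/mypail`.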

1 Answer:

Answer 0 (score: 0)

I ran into the same problem and solved it! You should add core-site.xml to the Hadoop Configuration object; something like this should work:

Configuration cfg = new Configuration();
// Load the cluster's core-site.xml so the Configuration knows the default FS
Path core_site_path = new Path("path/to/your/core-site.xml");
cfg.addResource(core_site_path);
FileSystem fs = FileSystem.get(cfg);
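For reference, the part of core-site.xml that matters here is the `fs.defaultFS` property (called `fs.default.name` in older Hadoop versions). A minimal sketch, using the host and port from the question purely as placeholders:

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.0.101:8080</value>
  </property>
</configuration>
```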

I guess you could also set the fs.defaultFS property on the cfg object programmatically.
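A sketch of that programmatic alternative, assuming the NameNode really listens on 192.168.0.101:8080 as in the question:

```java
// Point the default filesystem at the HDFS NameNode directly,
// instead of loading it from core-site.xml
Configuration cfg = new Configuration();
cfg.set("fs.defaultFS", "hdfs://192.168.0.101:8080");
FileSystem fs = FileSystem.get(cfg);
```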

来源: http://opensourceconnections.com/blog/2013/03/24/hdfs-debugging-wrong-fs-expected-file-exception/