Accessing HDFS from Eclipse using Java

Date: 2018-04-15 16:05:06

Tags: java hadoop hdfs hadoop2

Below is the code I use to access HDFS:

package myDefaultPackage;

import java.io.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.*;

public class Testing_HDFS_File {

    public static void main(String[] args) throws Exception {
        try {
            Configuration config = new Configuration();
            config.set("fs.defaultFS", "hdfs://192.168.28.153:9000/");
            // Load the cluster configuration before creating the FileSystem instance
            config.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
            FileSystem dfs = FileSystem.get(config);
            Path pt = new Path("hdfs://192.168.28.153:9000/user/hduser/wordcountinput/input.txt");
            BufferedReader br = new BufferedReader(new InputStreamReader(dfs.open(pt)));
            // Print the file line by line
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
            br.close();
        } catch (Exception e) {
            System.out.println(e.getMessage());
            e.printStackTrace();
        }
    }
}

I am getting this exception:

WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.

No FileSystem for scheme: hdfs
java.io.IOException: No FileSystem for scheme: hdfs
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2138)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2145)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:80)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2184)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2166)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:302)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:158)
    at myDefaultPackage.Testing_HDFS_File.main(Testing_HDFS_File.java:15)

1 Answer:

Answer 0 (score: 2):

Use only the specific jars you actually need; many of the jars lying around are not the right ones. The "No FileSystem for scheme: hdfs" error usually means the jar that provides the hdfs:// implementation (hadoop-hdfs) is missing from the classpath, or a mismatched jar is shadowing it.
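As a minimal sketch of how this is often worked around, assuming both hadoop-common and hadoop-hdfs are on the build path: the fs.hdfs.impl property can point Hadoop directly at the DistributedFileSystem class, which avoids relying on the service-loader metadata being merged correctly. The class name HdfsClientSketch is made up for this example; whether the explicit property is needed depends on your setup.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    // Hypothetical example class, not part of the original question
    public class HdfsClientSketch {
        public static void main(String[] args) throws Exception {
            Configuration config = new Configuration();
            config.set("fs.defaultFS", "hdfs://192.168.28.153:9000/");
            // Explicitly register the HDFS implementation. This sidesteps the
            // "No FileSystem for scheme: hdfs" error when the service entry from
            // hadoop-hdfs is not picked up, but the hadoop-hdfs jar itself must
            // still be on the classpath for the class to load.
            config.set("fs.hdfs.impl", DistributedFileSystem.class.getName());
            FileSystem dfs = FileSystem.get(config);
            System.out.println("Connected to: " + dfs.getUri());
            dfs.close();
        }
    }

If this sketch connects successfully, the remaining read logic from the question should work unchanged against the same FileSystem instance.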