java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem

Date: 2018-04-12 14:30:58

Tags: java eclipse hadoop

I want to run the following program with hadoop-3.0:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;


public class HDFSFileTest {
    public static void main(String[] args) {
        try {
            String fileName = "input/test.txt";
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://localhost:9000");
            conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
            FileSystem fs = FileSystem.get(conf);
            if (fs.exists(new Path(fileName))) {
                System.out.println("File exists!");
            } else {
                System.out.println("File does not exist!");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

But when I run the code in Eclipse, I get this exception:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2559)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3254)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3286)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:123)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3337)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3305)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:476)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:225)
    at HDFSFileTest.main(HDFSFileTest.java:13)
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
    at

I have already checked the jars hadoop-common-3.0.1.jar and hadoop-hdfs-3.0.1.jar, and there is no such class named

 org.apache.hadoop.hdfs.DistributedFileSystem
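As a quick way to check whether the class is visible at runtime (rather than inspecting jar files by hand), a small classpath probe like the following can help. This is a diagnostic sketch of my own, not part of the Hadoop API; the class name is the one taken from the exception message:

```java
// Minimal probe: reports whether a class can be loaded from the current classpath.
public class ClasspathProbe {

    // Returns true if the named class is loadable, false on ClassNotFoundException.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = "org.apache.hadoop.hdfs.DistributedFileSystem";
        System.out.println(cls + (isOnClasspath(cls) ? " is" : " is NOT") + " on the classpath");
    }
}
```

Running this with the same classpath Eclipse uses for the project shows immediately whether the missing class is a classpath problem or something else.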

4 Answers:

Answer 0 (score: 0)

Answer 1 (score: 0)

You can download the client core jar for your Hadoop version and try running the code again. This sometimes happens because of a missing jar.

The second thing is that "fs.hdfs.impl" has been deprecated.

You can check: https://community.hortonworks.com/questions/32800/where-can-i-find-fshdfsimpl-property.html
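Following up on that point: in recent Hadoop versions the FileSystem implementation for the hdfs:// scheme is discovered automatically (via the service-loader mechanism), so the explicit fs.hdfs.impl setting can usually be dropped. A minimal core-site.xml sketch, using the localhost:9000 URI from the question, might look like:

```xml
<configuration>
  <!-- Only the default filesystem URI is needed; the hdfs:// scheme
       is resolved to DistributedFileSystem automatically, provided the
       hdfs client jars are on the classpath. -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

The same applies in code: setting only fs.defaultFS on the Configuration object should be enough once the right jars are present.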

Answer 2 (score: 0)

Maybe you just need to add some other jars.

I ran into the same problem before; then I happened to add some other jars, as shown below. Luckily, it worked.

[Screenshot: list of additional Hadoop jars added to the build path]

Answer 3 (score: 0)

I needed to add these dependencies (Gradle format):

implementation 'org.apache.hadoop:hadoop-common:3.2.1'
implementation 'org.apache.hadoop:hadoop-hdfs:3.2.1'
implementation 'org.apache.hadoop:hadoop-hdfs-client:3.2.1'
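For Maven-based projects, the equivalent dependency declarations might look like this (version 3.2.1 is copied from the Gradle lines above; match it to your cluster's Hadoop version). Note that in Hadoop 3.x the DistributedFileSystem class ships in the hadoop-hdfs-client artifact, which is likely why adding it resolves the ClassNotFoundException:

```xml
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.2.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>3.2.1</version>
  </dependency>
  <!-- Contains org.apache.hadoop.hdfs.DistributedFileSystem in Hadoop 3.x -->
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs-client</artifactId>
    <version>3.2.1</version>
  </dependency>
</dependencies>
```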