I know there are many posts about this exception, but I have not been able to resolve it. I would like to fix it by editing the classpath. I am trying to run a program called DistMap on a Hadoop infrastructure. This is the error I get:
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.PlatformName
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.util.PlatformName. Program will exit.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FsShell
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FsShell
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.fs.FsShell. Program will exit.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.PlatformName
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.util.PlatformName. Program will exit.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FsShell
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FsShell
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.fs.FsShell. Program will exit.
Error could not create input directory /distmap_output_input folder on hdfs file system
which java says:
/usr/java/jdk1.6.0_32/bin/java
echo $CLASSPATH gives an empty line.
cat ~/.bash_profile says:
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
. ~/.bashrc
fi
# User specific environment and startup programs
PATH=$PATH:$HOME/bin
export PATH
Update:
$HADOOP_HOME is /usr/lib/hadoop
$HADOOP_CLASSPATH is:
/usr/lib/hadoop-0.20-mapreduce/hadoop-ant-2.0.0-mr1-cdh4.4.0.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-examples-2.0.0-mr1-cdh4.4.0.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-examples.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-core.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-core-2.0.0-mr1-cdh4.4.0.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-test-2.0.0-mr1-cdh4.4.0.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-test.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-tools-2.0.0-mr1-cdh4.4.0.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-tools.jar:/usr/lib/hadoop-0.20-mapreduce/lib/*jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-common-2.0.0-cdh4.4.0.jar
The last two jar files contain the PlatformName and FsShell classes, yet it still doesn't work.
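(As a sanity check, something like the following can confirm the classes really are in those jars, and expose the launcher's own classpath to plain java invocations. This is only a sketch, assuming the CDH 4.4.0 paths quoted above; `hadoop classpath` is the stock Hadoop helper that prints the jar list the launcher scripts use:)

# Check that the jars actually contain the missing classes (paths from the update above)
jar tf /usr/lib/hadoop/hadoop-common-2.0.0-cdh4.4.0.jar | grep PlatformName
jar tf /usr/lib/hadoop/hadoop-common-2.0.0-cdh4.4.0.jar | grep FsShell

# 'hadoop classpath' prints the full jar list the hadoop scripts themselves use;
# exporting it makes plain 'java' invocations in this shell see the same jars
export CLASSPATH=$(hadoop classpath)
echo $CLASSPATH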
Can anyone help me fix this?
Thanks.
Answer 0 (score: 18)
Answer 1 (score: 3)
If you are a Maven user and run into this problem, add the following dependencies:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.client.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoop.client.version}</version>
</dependency>
Note: hadoop-client:2.5.2 alone did not pull in all of the required Hadoop dependencies. That is why I also added hadoop-common, which brings in everything that is needed.
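To verify which Hadoop artifacts Maven actually resolved onto the classpath, the standard dependency report can be filtered by group id (a sketch; ${hadoop.client.version} is whatever property your build defines):

# Show the resolved org.apache.hadoop artifacts and the versions Maven picked
mvn dependency:tree -Dincludes=org.apache.hadoop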
Answer 2 (score: 0)
Answer 3 (score: 0)
In my particular case, I was using Maven, and even though I had hadoop-auth in both my POM and my classpath, I was still getting this exception.
What finally fixed it for me was changing the dependency's scope in the POM from "provided" to "compile".
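Concretely, the change looks like this in the POM (a sketch only; hadoop-auth and the ${hadoop.client.version} property stand in for whatever your build already declares):

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-auth</artifactId>
  <version>${hadoop.client.version}</version>
  <!-- was <scope>provided</scope>, which keeps the jar off the runtime classpath -->
  <scope>compile</scope>
</dependency>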