Hadoop 2.7.1. I have an Eclipse (Maven) project. I can run the Hadoop wordcount example, so Hadoop is configured correctly. However, if at runtime I instantiate an object that uses Apache Jena's Model class, the following error is thrown:
Exception in thread "main" java.lang.NoClassDefFoundError:
com/hp/hpl/jena/rdf/model/Model
at hadoop.wordcount.WordCount.main(WordCount.java:20)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.hp.hpl.jena.rdf.model.Model
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
The same object (the one that uses Jena) works fine in a standalone (non-Hadoop) project. The same error occurs when I try to run the stats Hadoop example. To build the jar, I run:
mvn clean package
It seems that whenever I use classes from external libraries, those classes are not included in the generated jar. Where am I going wrong? Any suggestions? I'm out of ideas at this point!
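For context: by default, `mvn clean package` builds a jar containing only the project's own classes; compile-scope dependencies such as Jena are left out, so `hadoop jar` cannot find them. A dependency declaration like the following (the version is an assumption; Jena 2.x is the line that still uses the `com.hp.hpl.jena` package seen in the stack trace) makes the class available at compile time only — it still has to be bundled into the job jar separately:

```xml
<!-- Declares Jena for compilation; this alone does NOT put the
     classes into the jar that "hadoop jar" executes. -->
<dependency>
  <groupId>org.apache.jena</groupId>
  <artifactId>jena-core</artifactId>
  <version>2.13.0</version> <!-- assumed; any 2.x keeps com.hp.hpl.jena -->
</dependency>
```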
Edit #1:
I tried building with:
mvn clean compile assembly:single
using this assembly descriptor:
<assembly>
  <id>hadoop-job</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <unpack>false</unpack>
      <scope>provided</scope>
      <outputDirectory>lib</outputDirectory>
      <excludes>
        <exclude>${groupId}:${artifactId}</exclude>
      </excludes>
    </dependencySet>
    <dependencySet>
      <unpack>true</unpack>
      <includes>
        <include>${groupId}:${artifactId}</include>
      </includes>
    </dependencySet>
  </dependencySets>
</assembly>
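Note that a descriptor like this only takes effect if the assembly plugin is wired into the pom. A minimal sketch, assuming the descriptor above is saved as `src/main/assembly/hadoop-job.xml` (the path and plugin version are assumptions):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.6</version>
  <configuration>
    <!-- Points assembly:single at the custom descriptor -->
    <descriptors>
      <descriptor>src/main/assembly/hadoop-job.xml</descriptor>
    </descriptors>
  </configuration>
</plugin>
```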
but I still face the same problem.
Edit #2: In my case this worked: include this plugin in pom.xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <configuration>
    <shadedArtifactAttached>true</shadedArtifactAttached>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
              <exclude>META-INF/LICENSE*</exclude>
              <exclude>license/*</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
then run
mvn clean compile package
(solution from @Garry: Hadoop java.io.IOException: Mkdirs failed to create /some/path)
It works now, but the "unpacking" step takes a very long time. Any known workaround?
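One option worth trying (a sketch, not verified against this project): the shade plugin has a `minimizeJar` setting that drops dependency classes the project never references, which shrinks the jar and can shorten the unpack/repack step. It would go into the existing shade `<configuration>` block:

```xml
<configuration>
  <shadedArtifactAttached>true</shadedArtifactAttached>
  <!-- Strips unreferenced classes from bundled dependencies.
       Caveat: it can remove classes that are loaded reflectively. -->
  <minimizeJar>true</minimizeJar>
</configuration>
```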