Internal error when connecting to Hadoop DFS

Time: 2015-03-25 09:45:57

Tags: hadoop eclipse-plugin mapreduce

I built an Eclipse plugin for hadoop-2.3.0. The Bundle-ClassPath is:

Bundle-classpath: classes/, 
lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
lib/hadoop-auth-${hadoop.version}.jar,
lib/hadoop-common-${hadoop.version}.jar,
lib/hadoop-hdfs-${hadoop.version}.jar,
lib/protobuf-java-${protobuf.version}.jar,
lib/log4j-${log4j.version}.jar,
lib/commons-cli-1.2.jar,
lib/commons-configuration-1.6.jar,
lib/commons-httpclient-3.1.jar,
lib/commons-lang-2.5.jar,
lib/commons-collections-${commons-collections.version}.jar,
lib/jackson-core-asl-1.8.8.jar,
lib/jackson-mapper-asl-1.8.8.jar,
lib/slf4j-log4j12-1.7.5.jar,
lib/slf4j-api-1.7.5.jar,
lib/guava-${guava.version}.jar,
lib/netty-${netty.version}.jar

I added the built jar file hadoop-eclipse-plugin.jar to Eclipse's plugins directory. I am using the Eclipse Kepler SR2 package. When I try to create an HDFS location and connect, an internal error is raised:

An internal error occurred during: "Map/Reduce location status updater". org/apache/commons/lang/StringUtils

What could be causing this error, and how can it be fixed? Any help is appreciated.
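An error message consisting of just a slash-separated class name like `org/apache/commons/lang/StringUtils` is the typical signature of a `NoClassDefFoundError`, i.e. commons-lang is not actually reachable on the plugin's runtime classpath even though it is listed in the Bundle-ClassPath. A quick way to probe whether the class is resolvable from a given classpath is a small check like this (a diagnostic sketch I wrote for illustration, not code from the original post):

```java
// MissingClassDemo: probe whether org.apache.commons.lang.StringUtils is
// loadable from the current classpath. Run with and without the
// commons-lang jar on the classpath to see both outcomes.
public class MissingClassDemo {
    public static void main(String[] args) {
        try {
            Class<?> c = Class.forName("org.apache.commons.lang.StringUtils");
            System.out.println("on the classpath: " + c.getName());
        } catch (ClassNotFoundException e) {
            // This branch corresponds to the plugin's failure mode:
            // the jar is named in the manifest but not actually bundled.
            System.out.println("missing: " + e.getMessage());
        }
    }
}
```

If the class is missing when run inside the plugin's environment, the likely explanation is that commons-lang-2.5.jar was never copied into the plugin's `lib/` folder during the build, so the Bundle-ClassPath entry points at a file that does not exist in the packaged jar.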

0 Answers:

No answers yet.