I have seen many references to this problem, and I have posted it here.
I am using Hadoop 2.4.1 and Flume 1.5.0.1. My flume-env.sh is configured as follows:
FLUME_CLASSPATH="/var/lib/apache-flume-ng:lib/hadoop-core-1.2.0.jar:lib/hadoop-auth-2.4.1.jar:lib/hadoop-yarn-api-2.4.1.jar:lib/hadoop-mapreduce-client-jobclient-2.4.1.jar:lib/hadoop-mapreduce-client-core-2.4.1.jar:lib/hadoop-common-2.4.1.jar:lib/hadoop-annotations-2.4.1.jar"
Along with these jars I have added one more jar to Flume's lib directory: commons-configuration-1.6.jar. I am new to both Flume and Hadoop.
The full stack trace is below:
ERROR [conf-file-poller-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run:149) - Unhandled error
java.lang.NoSuchFieldError: IBM_JAVA
at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:337)
at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:382)
at org.apache.flume.sink.hdfs.HDFSEventSink.authenticate(HDFSEventSink.java:553)
at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:272)
at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:418)
at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:103)
at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Answer 0 (score: 4)
The problem is caused by missing or conflicting dependencies: hadoop-core-1.2.0.jar is a Hadoop 1.x artifact, and its org.apache.hadoop.util.PlatformName class predates the IBM_JAVA field that UserGroupInformation in the 2.4.1 jars expects. Since it appears on FLUME_CLASSPATH ahead of the Hadoop 2.4.1 jars, the stale class is the one that gets loaded. Remove hadoop-core-1.2.0.jar from the classpath; the Hadoop 2.4.1 jars already provide the classes it supplied. This will solve the problem.
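For reference, a sketch of the corrected flume-env.sh with the conflicting jar dropped; the remaining entries are unchanged from the question:

# flume-env.sh: hadoop-core-1.2.0.jar removed, only Hadoop 2.4.1 jars remain
FLUME_CLASSPATH="/var/lib/apache-flume-ng:lib/hadoop-auth-2.4.1.jar:lib/hadoop-yarn-api-2.4.1.jar:lib/hadoop-mapreduce-client-jobclient-2.4.1.jar:lib/hadoop-mapreduce-client-core-2.4.1.jar:lib/hadoop-common-2.4.1.jar:lib/hadoop-annotations-2.4.1.jar"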
Answer 1 (score: -1)
Finally I found an answer that worked for me.
Copy the PlatformName class from hadoop-auth and compile a customized version of it locally:
package org.apache.hadoop.util;

public class PlatformName {

    // e.g. "Linux-amd64-64": OS name, architecture, and JVM data model.
    private static final String platformName =
            System.getProperty("os.name") + "-"
            + System.getProperty("os.arch") + "-"
            + System.getProperty("sun.arch.data.model");

    public static final String JAVA_VENDOR_NAME = System.getProperty("java.vendor");

    // The field UserGroupInformation looks up; absent from the Hadoop 1.x class.
    public static final boolean IBM_JAVA = JAVA_VENDOR_NAME.contains("IBM");

    public static String getPlatformName() {
        return platformName;
    }

    public static void main(String[] args) {
        System.out.println(platformName);
    }
}
Paste the compiled class file into the hadoop-core jar, overwriting the stale org/apache/hadoop/util/PlatformName.class. After that it should start working.
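A minimal sketch of the compile-and-patch steps, assuming the jar lives under the Flume installation shown in flume-env.sh above (paths are illustrative):

# Compile the class in the package layout it declares.
mkdir -p org/apache/hadoop/util
cp PlatformName.java org/apache/hadoop/util/
javac org/apache/hadoop/util/PlatformName.java
# Overwrite the stale class inside the Hadoop 1.x jar ("jar uf" updates it in place).
jar uf /var/lib/apache-flume-ng/lib/hadoop-core-1.2.0.jar org/apache/hadoop/util/PlatformName.class
# Restart the Flume agent so the patched jar is picked up.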
Thanks, everyone.