Flume log4j Appender

Date: 2015-11-25 13:12:42

Tags: hadoop hdfs flume

I am trying to configure Flume to write the Hadoop service logs to a common sink.

Here is what I added to the HDFS log4j.properties:
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hadoop.root.logger}, flume

#Flume Appender
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414
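One thing that stands out: the appender has no layout configured, which matches the `log4j:ERROR Could not find value for key log4j.appender.flume.layout` message in the output below. A minimal sketch of a completed appender config might look like this (the `PatternLayout` and conversion pattern are standard log4j, but the exact pattern shown is an assumption, not something from the original post):

```properties
#Flume Appender
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414
# Hypothetical addition: a layout so log4j stops complaining about
# the missing log4j.appender.flume.layout key
log4j.appender.flume.layout = org.apache.log4j.PatternLayout
log4j.appender.flume.layout.ConversionPattern = %d{ISO8601} %p %c: %m%n
```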

When I run the example pi job, I get this error:

$ hadoop jar hadoop-mapreduce-examples.jar pi 10 10
log4j:ERROR Could not find value for key log4j.appender.flume.layout
15/11/25 07:23:26 WARN api.NettyAvroRpcClient: Using default maxIOWorkers
log4j:ERROR RPC client creation failed! NettyAvroRpcClient { host: localhost, port: 41414 }: RPC connection error
Exception in thread "main" java.lang.ExceptionInInitializerError
            at org.apache.hadoop.util.RunJar.run(RunJar.java:200)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.commons.logging.LogConfigurationException: User-specified log class 'org.apache.commons.logging.impl.Log4JLogger' cannot be found or is not useable.
            at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:804)
            at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:541)
            at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:292)
            at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:269)
            at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
            at org.apache.hadoop.util.ShutdownHookManager.<clinit>(ShutdownHookManager.java:44)
            ... 2 more

I have already added these jars to the hadoop-hdfs lib directory:

avro-ipc-1.7.3.jar, flume-ng-log4jappender-1.5.2.2.2.7.1-33.jar, flume-ng-sdk-1.5.2.2.2.7.1-33.jar

I do have the commons-logging (commons-logging-1.1.3.jar) and log4j (1.2.17) jars present in the hdfs lib. Any pointers for debugging this issue?
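Separately from the classpath problem, the `RPC connection error` on `localhost:41414` usually means no Flume agent is actually listening on that port. For reference, a minimal agent with an Avro source matching the appender's Hostname/Port would look something like this (the agent and component names `a1`, `r1`, `c1`, `k1` and the `logger` sink are illustrative assumptions, not from the original post):

```properties
# Hypothetical minimal Flume agent config (flume.conf)
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Avro source matching the Log4jAppender's Hostname/Port
a1.sources.r1.type = avro
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 41414
a1.sources.r1.channels = c1

a1.channels.c1.type = memory

# Logger sink just to verify events arrive
a1.sinks.k1.type = logger
a1.sinks.k1.channel = c1
```

Such an agent would be started with something like `flume-ng agent -n a1 -c conf -f flume.conf`, and only then would the appender's RPC connection succeed.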

0 Answers:

No answers yet.